Article

Estimating the Innovator's Dilemma: Structural Analysis of Creative Destruction in the Hard Disk Drive Industry

Authors:
  • Mitsuru Igami (Tuck School of Business at Dartmouth College)

Abstract

Why do incumbent firms innovate more slowly than entrants? This incumbent-entrant timing gap is the key to understanding the industry dynamics of "creative destruction." Theories predict cannibalization between existing and new products delays incumbents' innovation, whereas preemptive motives accelerate it, and incumbents' cost (dis)advantage would further reinforce these tendencies. To empirically quantify these three forces, I develop and estimate a dynamic oligopoly model using a unique panel dataset of hard disk drive (HDD) manufacturers (1981–98), which I constructed from industry publications. The results suggest that despite strong preemptive motives and a substantial cost advantage over entrants, incumbents are reluctant to innovate because of cannibalization, which can explain at least 51% of the timing gap. I then discuss managerial implications of the findings, as well as welfare consequences of broad patents, trade barriers, and other competition/innovation policies.

Acknowledgments: I thank ... Yang for suggestions. I thank Minha Hwang for sharing engineering expertise and managerial insights into the manufacturing processes. I thank James Porter, the editor of DISK/TREND Reports, for sharing his encyclopedic industry knowledge and for making the reports available. I thank Clayton Christensen for inspiration and for encouraging a new approach to the innovator's dilemma. Previous versions of the paper were presented at IIOC, TADC, and REER. Financial support from the Nozawa Fellowship, the UCLA CIBER, and the Dissertation Year Fellowship is gratefully acknowledged. An earlier version of the paper received the Best Student Paper Award at the 11th Annual REER at Georgia Tech.

... In the current practice of sensitivity analysis, researchers typically re-estimate the model at a few (e.g., three) neighboring values of the fixed parameter used for the main analysis because estimating the model once is costly. For instance, Barwick and Pathak (2015), Fowlie et al. (2016), and Igami (2017) repeat the estimation using discount factors around the one used for the main analysis, and examine how the parameter estimates change with the discount factor. Although monotonic patterns are usually shown, they might not necessarily generalize to the entire support of the discount factor. ...
... Here, I allow γ to be a vector because researchers might fix multiple parameters. For instance, Igami (2017) calibrates the discount factor, the rate of change of innovation cost, and the number of potential entrants. ...
... Sensitivity analysis with respect to calibrated parameters is conducted by re-solving the optimization problem with different values of σ_s. Igami (2017) studies creative destruction in the hard disk drive industry using a dynamic discrete game model. The model is solved using the nested fixed-point approach. ...
Preprint
In dynamic discrete choice models, some parameters, such as the discount factor, are fixed rather than estimated. This paper proposes two sensitivity analysis procedures for dynamic discrete choice models with respect to the fixed parameters. First, I develop a local sensitivity measure that estimates the change in the target parameter for a unit change in the fixed parameter. This measure is fast to compute as it does not require model re-estimation. Second, I propose a global sensitivity analysis procedure that uses model primitives to study the relationship between target parameters and fixed parameters. I show how to apply the sensitivity analysis procedures of this paper through two empirical applications.
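As a rough sketch of the local measure described in this abstract (not the author's actual procedure), one can differentiate the estimator with respect to the fixed parameter via the implicit function theorem, d(theta_hat)/d(gamma) = -H_tt^{-1} H_tg, where H_tt and H_tg are second derivatives of the estimation objective at the estimate. The objective Q and all numbers below are made-up placeholders.

    import numpy as np

    def local_sensitivity(Q, theta_hat, gamma_0, eps=1e-5):
        """Approximate d(theta_hat)/d(gamma) at the fixed parameter gamma_0 without
        re-estimating the model, using finite differences of the objective Q(theta, gamma)."""
        k = len(theta_hat)

        def grad_theta(theta, gamma):
            g = np.zeros(k)
            for i in range(k):
                e = np.zeros(k); e[i] = eps
                g[i] = (Q(theta + e, gamma) - Q(theta - e, gamma)) / (2 * eps)
            return g

        H_tt = np.zeros((k, k))                        # d2Q / dtheta dtheta'
        for j in range(k):
            e = np.zeros(k); e[j] = eps
            H_tt[:, j] = (grad_theta(theta_hat + e, gamma_0) - grad_theta(theta_hat - e, gamma_0)) / (2 * eps)
        H_tg = (grad_theta(theta_hat, gamma_0 + eps) - grad_theta(theta_hat, gamma_0 - eps)) / (2 * eps)
        return -np.linalg.solve(H_tt, H_tg)

    # Toy objective whose minimizer is theta = 2 * gamma, so the true sensitivity is 2.
    Q = lambda theta, gamma: (theta[0] - 2.0 * gamma) ** 2
    print(local_sensitivity(Q, theta_hat=np.array([1.0]), gamma_0=0.5))  # ~[2.]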
... In the first sector-specific study, we focus on the Hard Disk Drive (HDD) industry. Taking advantage of prior research that defines the emergence of new technologies and the obsolescence of old ones (Christensen, 1997; Igami, 2017), we show that our measure captures this evolutionary process closely: patents associated with the old HDD generation have higher obsolescence when the new generation emerges. In the second study, we document that arrivals of radical innovation, as defined in Kelly et al. (2021), are followed by technological obsolescence of disrupted firms and industries. ...
... Our construction, by using the base excluding f's own patents and tracking only citations not made by f, is closer to capturing the obsolescence driven by movements of technology fields themselves. This industry has been an innovation economist's favorite for a few decades (Christensen, 1997; Igami, 2017), for a few reasons. First, it is an important sector in the computer industry that has been innovation-intensive since the late 1970s. ...
... Sources of Technological Obsolescence. As summarized in Garcia-Macia, Hsieh, and Klenow (2019), a firm's technological obsolescence could originate from cannibalization by the firm's own new innovation (Christensen, 1997; Igami, 2017), by the new technological breakthroughs of a firm's industry rivals (BSV, KPSS), or from innovation from outside the boundary of the specific industry (e.g., AirBnB disrupting hotels; iPad and Kindle disrupting traditional printing copies). ...
... (Schmidt-Dengler, 2006) studies US hospitals' decisions to adopt magnetic resonance imaging (MRI). (Igami, 2017) studies how cannibalization, preemption, and incumbents' cost advantages shape firms' adoption of a new generation of hard disk drives. My paper adds to this literature by studying how regulation affects technology adoption. ...
... Backward induction can be applied in these settings due to a finite horizon assumption (Igami, 2017) or full adoption in finite time (Schmidt-Dengler, 2006). I instead model technology adoption as happening in an infinite horizon and assume that the game has a non-stationary part followed by a stationary part. ...
... Models of technology adoption must somehow contend with the fact that the demand for and costs of adopting a new technology vary over time. One way of dealing with the time-varying nature of demand and costs that appears in the literature is to assume a finite horizon and solve the game played by firms via backward induction; see, e.g., (Igami, 2017). That method raises the issue of assigning continuation values to different industry states in the final time period. ...
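As a concrete illustration of the finite-horizon, backward-induction approach discussed in the excerpts above, here is a minimal single-agent sketch with i.i.d. type-1 extreme-value shocks. The payoffs, transition matrices, terminal continuation values, and discount factor are toy placeholders, not quantities from any of the cited papers.

    import numpy as np

    def solve_backward(u, P, V_T, beta=0.9):
        """Backward induction for a finite-horizon dynamic discrete choice problem.
        u[t][s, a]  : flow payoff in period t, state s, action a
        P[a][s, s'] : transition probabilities given action a
        V_T[s]      : continuation value assigned to each state after the last period"""
        T, (S, A) = len(u), u[0].shape
        V = [None] * (T + 1)
        CCP = [None] * T
        V[T] = V_T
        for t in reversed(range(T)):
            # Alternative-specific ("Q") values, then the logit inclusive value.
            q = np.column_stack([u[t][:, a] + beta * P[a] @ V[t + 1] for a in range(A)])
            V[t] = np.log(np.exp(q).sum(axis=1))      # E max over iid EV(1) shocks (up to Euler's constant)
            CCP[t] = np.exp(q - V[t][:, None])        # conditional choice probabilities
        return CCP, V

    # Toy example: 2 states, 2 actions, 3 periods, zero terminal continuation values.
    u = [np.array([[0.0, 1.0], [0.5, 0.2]]) for _ in range(3)]
    P = [np.array([[0.9, 0.1], [0.2, 0.8]]), np.array([[0.5, 0.5], [0.5, 0.5]])]
    print(solve_backward(u, P, V_T=np.zeros(2))[0][0])  # first-period choice probabilities by state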
Article
In the first chapter of this dissertation, I study coverage requirements, a common regulation in the mobile telecommunications industry that intends to accelerate the roll-out of new mobile telecommunications technologies to disadvantaged areas. I argue that the regulation may engender entry deterrence effects that limit its efficacy and lead to technology introduction patterns that are not cost-efficient. To quantify the impact of coverage requirements on market structure and the speed and cost of technology roll-out, I develop and estimate a dynamic game of entry and technology upgrade under regulation. I estimate the model using panel data on mobile technology availability at the municipality level in Brazil. In counterfactual simulations, I find that coverage requirements accelerate the introduction of 3G technology by just over one year, on average, and reduce firms’ profits by 24% relative to a scenario with no regulation. I find the entry deterrence effects to be small. Moreover, an alternative subsidization policy leads to a similar acceleration in the roll-out of 3G and substantially higher aggregate profits, likely increasing aggregate welfare relative to coverage requirements. In the second chapter, I investigate how the portfolio of products carried by retailers influences wholesale and retail prices. To this end, I develop and estimate a model of retailer pricing and retailer-manufacturer negotiations over wholesale prices. The estimation approach extends existing econometric tools for multi-product bargaining models to a setting with optimal downstream pricing. I use the estimated model to simulate the effects of counterfactual scenarios in which private label products or the products of a national manufacturer are excluded from retailers’ product portfolios. I find that wholesale prices do increase, but those effects are small. Eliminating private label products leads to an average increase in wholesale prices of only 0.10%; retail prices increase by only 0.04%. Eliminating a national manufacturer’s products leads to increases in wholesale prices between 0.003% and 0.677%; retail prices decrease by 0.027%-4.210% due to downstream pricing incentives.
... An incumbent can bring new products to foreign markets by cannibalizing extant products, whereas new entrants cannot. Incumbents are reluctant to innovate new products after entering a market successfully, whereas entrants offer new products in response to competition (Igami, 2017). ...
... This is a demand-driven explanation for the affirmative relation between new exports and product innovation. In contrast, cannibalization makes incumbent exporters reluctant to innovate products for foreign markets (Igami, 2017). ...
... See Scherer (1991), Cohen and Klepper (1996), Yin and Zuschovitch (1998), Baldwin and Sabourin (1999), Petsas and Giannikos (2005), Plehn-Dujowich (2009), and Igami (2017). In some cases, there may be increasing returns to scale (IRS) in undertaking product innovation because innovation development is a sunk cost. ...
Article
Purpose – Although models of innovation and exporting dominate recent studies of relations between innovation and access to foreign markets, relations between innovation and foreign direct investment (FDI) are less explored. This is especially true of relations between types of innovation and FDI. We fill that gap in the literature with empirical evidence that clarifies whether firms enter foreign markets through exports or FDI. Design/methodology – In order to assess the role of innovation in firms’ international engagement strategies, we develop research hypotheses and present new empirical evidence on firms’ choice of entry – exports and FDI – based on firm-level data. Findings – Our empirical results suggest that the impact of product innovation is more significant in the transition from being a purely domestic firm to an exporter, while process innovation more significantly affects the transition from being an exporter to a multinational enterprise. Our results also support ‘self-selection into FDI’ rather than ‘learning-by-performing FDI’ in the relationship between innovation and firms’ overseas expansion. Originality/value – Recent literature on the relationship between innovation and firms’ participation in foreign markets is dominated by models of innovation and export behavior. However, foreign direct investment by multinational enterprises may also be associated with firms’ innovative activities. We first analyze how product and process innovations influence firms’ choices to initiate exports or FDI.
... An unsatisfactory feature of a sequential-move game is that the assumption on the order of moves will generate an artificial early-mover advantage if the order is deterministic (e.g., Gowrisankaran 1995, 1999; Igami 2017, 2018). Instead, we propose a random-mover dynamic game in which the turn-to-move arrives stochastically. ...
... Applications of dynamic games to mergers include Gowrisankaran (1995, 1999). Iskhakov, Rust, and Schjerning (2014, 2016) numerically study Bertrand duopoly with "leapfrogging" process innovations with a random-mover setup. Igami (2017, 2018) studied the HDD industry as well, but the similarities end there. Our paper differs from his in three major ways: questions, data, and models. ...
... Instead, we consider an alternating-move game in which the time interval is relatively short and only (up to) one firm has an opportunity to make a dynamic discrete choice within a period. Gowrisankaran (1995, 1999) and Igami (2017, 2018) are examples of such a formulation with deterministic orders of moves, but researchers usually do not have theoretical or empirical reason to favor one specific order over the others. A deterministic order is particularly undesirable for analyzing endogenous mergers, because early-mover advantages will translate into stronger bargaining powers, tilting the playing field and equilibrium outcomes in favor of certain firms. ...
Article
Full-text available
How far should an industry be allowed to consolidate when competition and innovation are endogenous? We develop a stochastically alternating-move game of dynamic oligopoly and estimate it using data from the hard disk drive industry, in which a dozen global players consolidated into only three in the last 20 years. We find plateau-shaped equilibrium relationships between competition and innovation, with heterogeneity across time and productivity. Our counterfactual simulations suggest the current rule-of-thumb policy, which stops mergers when three or fewer firms exist, strikes approximately the right balance between pro-competitive effects and value-destruction side effects in this dynamic welfare trade-off.
... A number of recent studies have examined preemption using the EP framework. Examples include Aguirregabiria and Ho (2012), Igami (2015), Igami and Yang (2016), Zheng (2016), and Hünermund et al. (2014). All studies employ some type of counterfactual of preemption. ...
... For example, in Igami (2015), the incumbent is chosen as the preemptor and is given the first mover advantage, so that in equilibrium, the incumbent will indeed preempt entry. Similarly, Zheng (2016) also presets one retail chain, the Blue firm, as the preemptor and models the Blue firm as the first mover in each period of the game. ...
... Since the definitions of preemption imply the counterfactuals, I focus on discussing the counterfactuals here instead of the definitions themselves. As mentioned before, a number of papers have studied spatial preemption, including Igami (2015), Igami and Yang (2016), Zheng (2016), Yang (2015), and Hünermund et al. (2014). All the counterfactuals in these papers change other aspects of the equilibrium outcome while eliminating or reducing the effect of preemption. ...
... Note: The number of firms counts only the major firms with market shares exceeding one percent at some point of time. See Igami (2015a, 2015b) for the detailed analyses of product and process innovations during the first two decades of the sample period. ...
... When the non-proposer is a potential entrant, this "non-proposer" expected value is simpler than (15), ...
... because it does not earn profit, pay fixed cost, or become a merger target. When nature picks a potential entrant j as a "proposer," (15) and (17) become ...
... Second, many theoretical papers have studied preemption games, including Fudenberg and Tirole (1985), Riordan (1992), Quint and Einav (2005), and Argenziano and Schmidt-Dengler (2012). Few empirical papers structurally estimate such models, but Schmidt-Dengler (2006) and Igami (2015) do so in the specific context of technology adoption. By contrast, this paper aims to quantify the effect of preemption motives in a more general context of entry and market structure, and assesses their implications for the measurement of competition. ...
... , and under the standard normalization to set κ− = 0. We follow the standard empirical models of entry and market structure (e.g., Seim 2006) and specify the average period profit per outlet as ...
... represents the effect of chain j's shop on chain i's shop, so that n_{-imt} = Σ_{j≠i} n_{jmt} becomes a sufficient statistic for rival-chain competition. We should carefully interpret α1, κ+, and κ− because they are not identical to the primitives of the model (α1, κ+, κ−) and are not separately identified from each other. Under our normalization, κ− = 0, Aguirregabiria and Suzuki (2014) show α1 ...
Article
Full-text available
We develop a dynamic entry model of multi-store oligopoly with heterogeneous markets, and estimate it using data on hamburger chains in Canada (1970–2005). Because more lucrative markets attract more entry, firms appear to favor the presence of more rivals. Thus unobserved heterogeneity across geographical markets creates an endogeneity problem and poses a methodological challenge in the estimation of dynamic games, which we address by combining the procedures proposed by Kasahara and Shimotsu (2009), Arcidiacono and Miller (2011), and Bajari, Benkard, and Levin (2007). The results suggest that the omission of unobserved market heterogeneity attenuates the estimates of competition, and the trade-off between cannibalization and preemption is an important factor behind the evolution of market structure.
... Furthermore, I will incorporate heterogeneous organizational types of firms (i.e., specialized versus conglomerate structures) as a sensitivity analysis (Table 4, right column). By contrast, the firm's size, age, and technological/product generations do not show clear patterns, and hence I will abstract from these aspects and refer the reader to Igami (2015) for further details on product innovation in the HDD industry. ...
... HDDs, but the empirical demand analysis incorporates more details to exploit additional variations in the HDD sales data, in which the unit of observation is the combination of generation, quality, year t, buyer category, and geographical regions (see Igami 2015 for the details of the HDD sales data). I denote the generation-quality pair by "product category" ...
... I do not model HDDs as durable goods because of fast obsolescence due to Kryder's Law, and also because the dynamics of re-purchasing cycles in the PC market are driven primarily by operating systems (e.g., Windows 95 and 98) or CPU chips (e.g., Intel's Pentium III), which I assume evolve exogenously to the HDD market. See Igami (2015) for details. ... Tape recorders, optical disk drives, and flash memory. ...
Article
Full-text available
This paper uncovers a novel pattern of offshoring dynamics in a high-tech industry, and proposes a structural model to explain it. Specifically, the hard disk drive industry (1976–98) witnessed massive waves of entry, exit, and the relocation of manufacturing plants to low-cost countries, in which shakeouts occurred predominantly among home firms and almost all survivors were offshore firms. I build and estimate a dynamic offshoring game with entry/exit to measure the benefits and costs of offshoring, investigate the relationship between offshoring and market structure, and assess the impacts of hypothetical government interventions.
... We use firm dummies to control for firm differences in such decisions, year dummies to control for industry dynamics influence, and technical parameter dummies to control for niche and product differences. To account for product withdrawal decisions that can also be driven by the interaction between producer and temporal differences (Igami, 2017), we ran a robustness analysis with firm-year dummies, which we created by interacting firm dummies and year dummies. To allow variation, we ran this analysis for the firms that offered at least two products, at least one of which had an informative name in a given year; otherwise models did not converge. ...
... The two industries include different firms, utilize different technologies (optical vs. magnetic) and make different products (optical disk drives vs. hard drives). While management scholars have studied the HDD industry extensively (e.g., Barnett and McKendrick, 2004; Christensen and Rosenbloom, 1995; Igami, 2017; King and Tucci, 2002; Lerner, 1997), the ODD industry has received far less attention (Khessina, 2003, 2006; Khessina and Carroll, 2008; Rosenkopf and Nerkar, 2001). ...
Article
Full-text available
How do customers discover new products? Recent research has found that a firm can facilitate the discovery and subsequent purchase of its product by giving it an advantageous name. However, no product exists in isolation, rather it competes for customer attention with other products both within and across product niches. We theorize that a product may benefit from the names of competitors’ products within its niche because certain product names can trigger a positive spillover effect. Specifically, product viability should increase with the proliferation of products with informative names in a focal niche because informative names attract attention to the niche, and consequently benefit all its products, regardless of whether they have informative names or not. This beneficial influence should be especially strong when a niche is new. Additionally, a product's market fate may depend not only on the prevalent naming practices in its niche, but also on naming practices in competing niches. We find support for our theorizing in event‐history analyses of all CD‐drive products shipped in the worldwide optical disk drive industry, 1983–1999. Ultimately, our findings suggest that in high‐velocity markets, to facilitate product discovery by customers, firms should enter niches populated by products with informative names.
... While the assumptions of a finite horizon and sequential moves are quite strong, they provide three crucial benefits: 1) the dynamic equilibrium is unique, 2) solving the dynamic game does not involve value function iterations and suffers no convergence problem (Egesdal, Lai and Su (2015)), 3) the finite horizon assumption also helps to capture the non-stationarity in data (Igami (2015)). I explore the robustness of both assumptions in the appendix of the paper version of the chapter. ...
... The model in this paper endogenizes both the dynamic investment decisions and the pricing of intermediate goods. I contribute to the growing literature that analyzes innovation with dynamic oligopoly models (Ericson and Pakes (1995), Goettler and Gordon (2011), Borkovsky (2012), Igami (2015), and others) by modeling the complementarity of innovations between the upstream and downstream firms. The static model of product competition is built on the empirical bilateral bargaining framework developed in Horn and Wolinsky (1988). ...
Thesis
This dissertation develops new methods to analyze firm behaviors and provides estimates and predictions that inform antitrust and innovation policies. The first chapter shows that horizontal merger policies may be tougher when taking into account a merger's effects on the composition of product offerings in addition to the merger's price effects. The second chapter quantifies and decomposes the effects of vertical integration. I show that the investment coordination effects are pro-innovation and dominate the price effects. The results suggest that vertical integration policies should fully consider the potentially positive dynamic implications of a vertical merger. Both chapters are empirical studies in the context of the US smartphone industry. The third chapter develops identification strategies for a general class of matching games and estimates the formation of investment relationships between venture capitalists and biomedical startups. The estimates show that unobservables may be as important as observables in determining which VC invests in which startup firm, and understanding these unobserved factors is important for innovation policies.
... It represents the value of being in state s after (conditional on) choosing a particular action a. Rust (1987) called this expression "alternative-specific value function" in his NFXP algorithm to estimate the parameter of u by solving the DP exactly; Watkins (1989) used the term "action value" (and notation Q) when he introduced Q-learning to solve the DP approximately. Hence, it is known as "Q-factor," "Q-value," and "state-action value" in the RL/ADP literature. ...
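For readers less familiar with the reinforcement-learning side of this connection, below is a toy tabular Q-learning update in the spirit of Watkins (1989), which approximates the same action value without solving the dynamic program exactly. The simulator, states, and parameter values are made up for illustration and are not tied to any model discussed on this page.

    import numpy as np

    rng = np.random.default_rng(0)

    def q_learning(step, n_states, n_actions, n_steps=2000, beta=0.95, alpha=0.1, eps=0.1):
        """Tabular Q-learning: learn Q(s, a) ~ u(s, a) + beta * E[max_a' Q(s', a')]
        from simulated transitions supplied by step(s, a) -> (reward, next_state)."""
        Q = np.zeros((n_states, n_actions))
        s = 0
        for _ in range(n_steps):
            a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
            r, s_next = step(s, a)
            # One-step temporal-difference update toward the Bellman target.
            Q[s, a] += alpha * (r + beta * Q[s_next].max() - Q[s, a])
            s = s_next
        return Q

    # Toy two-state chain: action 1 moves the agent to state 1 and pays 1 when taken there.
    def step(s, a):
        return (1.0 if (s == 1 and a == 1) else 0.0), (1 if a == 1 else 0)

    print(q_learning(step, n_states=2, n_actions=2))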
... An alternating-move game with a finite horizon has a unique equilibrium when an i.i.d. utility shock is introduced and breaks the tie between multiple discrete alternatives. Igami (2017, 2018) demonstrates how Rust's NFXP naturally extends to such cases with a deterministic order of moves; Igami and Uetake (2019) do the same with a stochastic order of moves. ...
Article
Full-text available
This article clarifies the connections between certain algorithms to develop artificial intelligence (AI) and the econometrics of dynamic structural models, with concrete examples of three “game AIs.” Chess-playing Deep Blue is a calibrated value function, whereas shogi-playing Bonanza is an estimated value function via Rust’s (1987) nested fixed-point method. AlphaGo’s “supervised-learning policy network” is a deep-neural-network implementation of the conditional-choice-probability estimation reminiscent of Hotz and Miller (1993); the construction of its “reinforcement-learning value network” is analogous to Hotz, Miller, Sanders, and Smith’s (1994) conditional choice simulation. I then explain the similarities and differences between AI-related methods and structural estimation more generally, and suggest areas of potential cross-fertilization.
... Schumpeter (1942) labeled this type of innovation 'creative destruction' because of the tension between current and new products/services. He saw this cycle as essential for a company's survival and growth, and the definition of a successful innovation became synonymous with economic improvement of the company (Atuahene-Gima, 2005; Damanpour, 1991; Igami, 2017; Nelson & Winter, 1982). Thus, how companies achieve economic gains by balancing existing (exploitation) versus new (exploration) innovations has been a long-standing seminal topic (Baregheh et al., 2009; Christensen, 1997; Schumpeter, 1942). ...
... et a very different value proposition. These innovations are considered destructive because they displace current products/services/processes and, at times, incumbent companies (Christensen, 1997; Christensen & Overdorf, 2000). Subsequent studies question how often incumbent companies pursue innovations of a creative destruction nature (Glazer, 2007; Igami, 2017; Markides, 2006). ...
Chapter
To extend our knowledge on how companies enact humanistic management we examined the nature and purpose of sustainable development innovation which companies reported through published sources. Using a positivist qualitative methodology based on a modified inductive categorical documentary analysis, we gathered data from 33 companies at two points in time (2009 and 2014) to identify and classify how companies integrate care for society and the environment into actions. Findings show that companies almost exclusively reported continuous improvements to exploit existing products, services, and/or processes rather than radical changes through bringing new products, services, and/or processes to the market. We categorized sustainable development innovation along two dimensions: technological versus social and incremental versus radical innovation. This framework may help practitioners and academics to better understand and foster rationales and schemas which promote corporate humanistic management.
... A glance at figure 3 shows that the main flows of HDDs are from the countries that have been known primarily as manufacturers or assemblers of HDDs. The lower production costs in South East Asian countries, such as Malaysia, Thailand, and China, have encouraged the U.S. and Japanese manufacturers of HDDs to outsource production activities (Igami 2017). Hence, we may conclude that the majority of traded HDDs reported by the UN Comtrade are new. ...
... However, the annual growth rate of purchases of electronics is estimated to be about 20% (Qiang et al. 2014), although this rate may vary for different countries and types of consumer electronics. There might be some data available on the annual growth rate of HDDs (Igami 2017; Sprecher et al. 2014), but we have decided to use the annual growth rate of consumer electronics for three main reasons: (1) all types of HDDs may not necessarily be included in the data; (2) many consumer electronics may contain HDDs that have not been recorded; and (3) consumer electronics are likely to be disposed of together with their internal parts. ...
Article
Full-text available
The remaining value within end‐of‐use/life hard disk drives (EoU/L HDDs) is often not optimally recovered. The improper collection and recovery of HDDs contribute not only to rising environmental and social concerns worldwide, but also to the transformation of the economy and a significant loss of value. Currently, the most preferred treatment option for used hard drives is to recover the metals with the highest recycling effectiveness, such as steel and aluminum, via a shredding‐based recycling process that results in both value and material leakages. The complexity of retrieving the remaining values within EoU/L HDDs demands a larger view of the global supply of HDDs available for recovery. The aim of this paper is to first identify the geographical patterns of transboundary global shipments of new and used HDDs between developing and developed regions, and then capture and quantify the value leakage by bringing several unique perspectives. Two analyses have been conducted. First, the loss of value due to the insufficient recovery of neodymium (Nd) at the global level is quantified. Second, the value leakage as a result of the delay on on‐time reuse of HDDs is captured. Furthermore, the central challenges toward proper recovery of HDDs, where consumer electronic industry can make significant contributions, have been identified. HDDs are well positioned to contribute important insights to the recovery of other electronic devices, so the findings from HDDs can be adopted for other types of electronics.
... We use firm dummies to control for firm differences in such decisions, year dummies to control for industry dynamics influence, and technical parameter dummies to control for niche and product differences. To account for product withdrawal decisions that can also be driven by the interaction between producer and temporal differences (Igami, 2017), we ran a robustness analysis with firm-year dummies, which we created by interacting firm dummies and year dummies. To allow variation, we ran this analysis for the firms that offered at least two products, at least one of which had an informative name in a given year; otherwise models did not converge. ...
... The two industries include different firms, utilize different technologies (optical vs. magnetic) and make different products (optical disk drives vs. hard drives). While management scholars have studied the HDD industry extensively (e.g., Barnett and McKendrick, 2004; Christensen and Rosenbloom, 1995; Igami, 2017; King and Tucci, 2002; Lerner, 1997), the ODD industry has received far less attention (Khessina, 2003, 2006; Khessina and Carroll, 2008; Rosenkopf and Nerkar, 2001). ...
Article
Organization and strategy scholars have long recognized that the performance of firms and their products depends on how well companies mitigate competition. We develop a theory explaining how organizations can engage customers by means of their products’ names to reduce the harmful effect of competition. We suggest that producer engagement through product naming has both immediate and ecological effects on product survival in a niche and theorize about three underlying processes. First, producers may increase survival chances of their products by giving them names that customers find helpful for initial product categorization. Second, the viability of products in a focal niche should increase with the density of producers that engage consumers through product naming. Finally, product market fate may depend not only on the prevalent naming practices in the focal niche, but also on naming practices in competing niches. Thus, our theory suggests that (1) producer engagement through product naming may have a profound effect on product demography in market niches, and (2) producer engagement does not happen in a vacuum and its ultimate effect depends on actions of other firms both in a focal and in competing niches. We find support for our theorizing in the event-history analysis of all CD-drive products shipped by all producers in the worldwide optical disk drive (ODD) industry, 1983-1999.
... Two approaches instead characterize 'employee learning and mobility models'. These approaches share some common assumptions: R&D employees can learn and exploit the know-how of the firm they work for (Franco and Mitchell, 2008; Sakakibara and Balasubramanian, 2020), and know-how transfer increases their potential for creating a spinout (Igami, 2017; Babina and Howell, 2020). ...
... For instance, the prediction that 'better', thus more knowledgeable, parents generate more spinoffs has found confirmation in the case of lasers (Klepper and Sleeper 2005). The prediction that spinouts' survival is increasing in their technological know-how has been confirmed in the case of the Hard Disk Drive industry (Agarwal et al. 2004; Igami, 2017). Other predictions are contrasting and remain to a certain extent unexplored. ...
Article
We present a model of spinout creation and survival to explain how the initial product market strategies of spinouts may differ with respect to their parent and how this strategy affects their success. We test the model using detailed information on all the entrants in three markets in the Local Area Networking (LAN) industry during the 1990s. Our findings are consistent with the implications of the theoretical model. Concerning spin-out generation, the parent firm’s technological know-how plays an important role for the initial product strategy. In particular, we find that spinouts tend to imitate ‘average’ parents and ‘keep away from the extremes’ (i.e. parents that are too good or too bad in terms of technological know-how). Concerning spin-out survival, we find that better spinouts survive longer and that there is no direct effect of a parent’s know-how on spin-out survival. Finally, too much diversification with respect to the parent firm can be detrimental to spin-out success.
... Disruptive innovation occurs when an invention displaces an old technology by providing superior benefits (Igami, 2017). Such innovation could result from a progressive shift or the emergence of an entirely new concept, approach, or innovation (Edema et al., 2022; Christensen, Raynor & McDonald, 2015). ...
Article
Full-text available
(Purpose) The COVID-19 pandemic caused unprecedented disruptions to global education, challenging teaching, research, and community engagement. This study examines these impacts on accounting education in private universities in Nigeria, focusing on preparedness for future crises. (Design/Methodology) A cross-sectional survey of 209 accounting professionals from Nigerian private universities explored disruptions in teaching, research, and community development. Descriptive and inferential statistics were used to evaluate their direct and indirect effects on performance indicators. (Findings) Community development showed the highest variability, while research exhibited the least. Teaching disruptions significantly increased ICT investment, demonstrating its role during the pandemic. However, no significant indirect pathways were found between ICT investment and disruptions in teaching or research, nor between ICT and performance indicators. (Originality/Value) This study uniquely investigates the dual impacts of pandemic-induced disruptions on accounting education in a developing economy on teaching, research, and community development. It shows the role of teaching in driving technological adaptation and provides critical insights into educational improvement in the future. (Conclusion) The COVID-19 pandemic influenced the tripartite missions of accounting education in Nigeria’s private universities, suggesting lessons for innovation and strategic preparedness against future disruptions; these findings contribute to global discussions on educational crisis management and resilience planning.
... Second, regulatory restrictions on banks can create opportunities for fintech-based firms as they make banking services more costly (Buchak et al 2018) or less appealing (Buchak et al 2021) for consumers. Finally, several authors have argued that banks, stuck with legacy technology and data infrastructure, are bound to lag new entrants in the introduction of innovation. The broader literature provides some support for this argument; startups may be more creative in their approach to innovation (Kolev, Haughey, Murray and Stern 2022), and incumbents may have disincentives to innovate for fear of cannibalizing existing revenue streams (Christensen 1997, Igami 2017). In addition, it is difficult to adapt business processes to exploiting innovations (Brynjolfsson and Hitt 2000). ...
... See Aguirregabiria and Mira (2010) for a survey of methodologies. Recently, Igami (2017) and Igami and Uetake (2020) extend the Nested Fixed Point algorithm to dynamic oligopoly games with sequential or stochastically alternating moves. ... and intertemporal preferences. ...
Preprint
This paper proposes an empirical model of dynamic discrete choice to allow for non-separable time preferences, generalizing the well-known Rust (1987) model. Under weak conditions, we show the existence of value functions and hence well-defined optimal choices. We construct a contraction mapping of the value function and propose an estimation method similar to Rust's nested fixed point algorithm. Finally, we apply the framework to the bus engine replacement data. We improve the fit of the data with our general model and reject the null hypothesis that Harold Zuercher has separable time preferences. Misspecifying an agent's preference as time-separable when it is not leads to biased inferences about structural parameters (such as the agent's risk attitudes) and misleading policy recommendations.
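A bare-bones sketch of the nested structure shared by Rust-style estimators like the one described above: an inner loop iterates the Bellman contraction to its fixed point, and an outer loop maximizes the likelihood over structural parameters. This uses the standard time-separable case rather than the paper's non-separable extension, and the utility specification, transition matrices, and data are hypothetical placeholders.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import logsumexp

    def flow_utility(theta):
        # Placeholder flow payoffs for a 3-state, 2-action problem (purely illustrative).
        s = np.arange(3)
        return np.column_stack([-theta[0] * s, -theta[1] * np.ones(3)])

    def solve_Q(theta, P, beta=0.95, tol=1e-10):
        """Inner loop: iterate the Bellman contraction for the value function to a fixed point."""
        u = flow_utility(theta)
        V = np.zeros(u.shape[0])
        while True:
            Q = u + beta * np.column_stack([P[a] @ V for a in range(u.shape[1])])
            V_new = logsumexp(Q, axis=1)              # E max under iid EV(1) shocks
            if np.max(np.abs(V_new - V)) < tol:
                return Q
            V = V_new

    def neg_loglik(theta, data, P):
        """Outer loop objective: likelihood of observed choices implied by the inner fixed point."""
        Q = solve_Q(theta, P)
        log_ccp = Q - logsumexp(Q, axis=1, keepdims=True)
        return -log_ccp[data["state"], data["action"]].sum()

    P = [np.array([[0.7, 0.3, 0.0], [0.0, 0.7, 0.3], [0.0, 0.0, 1.0]]),  # "keep": state drifts up
         np.tile([1.0, 0.0, 0.0], (3, 1))]                               # "replace": reset to state 0
    data = {"state": np.array([0, 1, 2, 1]), "action": np.array([0, 0, 1, 1])}
    print(minimize(neg_loglik, x0=[0.5, 0.5], args=(data, P), method="Nelder-Mead",
                   options={"maxiter": 200}).x)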
... The relationship between patent litigation, patent licensing, and subsequent industrialization has become a subject of research interest in recent years. Although patent litigation can be settled out of court, high patent licensing fees also hinder the technological innovation of enterprises [20]. Siebert and Graevenitz showed that patent licensing helps enterprises solve the problem of patent blocking [9]. ...
... Nonetheless, few empirical studies have attempted to estimate these forces in a dynamic equilibrium model. Some notable exceptions include Benkard (2004), Goettler and Gordon (2011), and Igami (2017). These papers focus on industries that are dominated by a few large firms and where strategic interactions in the product market play a central role in firms' innovation decisions. ...
... This study is built on the disruptive innovation theory. Disruptive innovation is described as a process by which a new innovation dislodges an existing innovation by offering superior benefits (Igami, 2017). Such innovation could result from a gradual transformation or introduction of a radically new idea, technique or technology (Christensen, Raynor & McDonald, 2015). ...
Article
Full-text available
The COVID-19 pandemic has caused significant disruptions in workplaces. The traditional tasks and techniques are giving way to emerging, creative and innovative ones with consequences for the workforce and the organization. How these alterations are challenging the management of the workforce and how organizations are dealing with the emerging issues is the focus of this research. The study is a theoretical review aimed at shedding light on the problems the coronavirus pandemic poses for the management of the workforce. Data were sourced from learned journals and analysed qualitatively. Despite ambiguity surrounding the performance of organizations, the post-COVID-19 era is projected to witness increased adoption of disruptive technologies and corresponding acceptance of digital work models. The difficulties being experienced now in managing the workforce will likely continue unabated until organizations fully integrate digitization of work processes. The workforce will be streamlined and an additional set of skills will be required to manage it. Therefore, organizations are expected to invest in the technology of the future and reskill themselves for the tasks of managing the future workforce.
... This can make them slow or unable to adopt those technologies (Schumpeter (1942), Arrow (1974)). That there are factors that appear to hold established firms back in the adoption of significant or radical innovations has received empirical support (notably, Henderson (1993) and Igami (2017)). ...
... Meanwhile, Igami (2017) shows that the innovation gap among hard disk drive manufacturers, between incumbents and entrants, is due to the dulled incentives of market leaders. Igami and Uetake (2020) study the same industry using a dynamic structural model and find that the incentive to innovate greatly increases when going from one to three or more competitors. ...
Thesis
This thesis primarily studies the topic of the impact of international trade on innovation, with a specific focus on developing countries. It also covers trade spillovers and gender inequality. Chapter 1 provides a survey of the recent theoretical and empirical economic literature discussing the effect of import competition on innovation-related outcomes. This chapter is divided into four main sections. First, a background on patents and trademarks as innovation indicators is presented. Second, the theoretical mechanisms behind the impact of import competition on innovation are discussed. Third, a literature review of the recent empirical studies on trade and innovation is summarized. Lastly, an empirical analysis is conducted to examine the impact of Chinese import competition on innovation, measured by patenting activity, in developing countries. Chapter 2 is an empirical exercise examining the impact of Chinese import competition on innovation, using a new measure which is trademarking. The study uses a panel dataset from 1995 to 2018 across over 100 developing countries. The empirical model corrects for endogeneity using a shift-share instrumental variable approach of Chinese imports in other developing countries. The main estimations find an overall decreasing effect of import competition on trademarking activity. The instrumental variable results are particularly robust for trademark applications, but not registrations. This provides suggestive evidence that import competition negatively affects new local product or service innovations in developing countries. Lastly, Chapter 3 tackles the question regarding the impact of international trade spillovers on gender equality norms. The majority of existing studies document the effect of liberalization episodes on gender inequality. That of trade actors, though, has been largely ignored. Is gender inequality within a country affected by its trading partners? This question was examined in this paper by adopting a spatial model of trade across 123 countries from 1997-2013. We use the gender inequality index to capture global trends on gender within a country. We then disaggregated trade into technology-level products in order to identify a technology channel. Results show that higher gender equality standards abroad spill over via imports of medium, high-tech and mineral products into higher domestic gender equality standards. Such results suggest that a technological revolution affects firms, households and governments in impacting female empowerment.
... By contrast, Igami and Uetake (2020) explicitly models the end of an industry by specifying a demand shifter that linearly decreases with time. An alternative assumption would be the expectation of continued demand growth after 1998, but that ...
Article
Full-text available
Do mergers help or hinder collusion? This paper studies the stability of the vitamin cartels in the 1990s and presents a repeated-games approach to quantify “coordinated effects” of a merger. We use data and direct evidence from American courts and European agencies to show the collusive incentive of the short-lived vitamin C cartel was likely to be negative when it actually collapsed in 1995, whereas the incentives of the long-lived cartels (vitamins A and E, and beta carotene) were unambiguously positive until the prosecution in 1999. Simulations suggest some mergers could have prolonged the vitamin C cartel, but others could have further destabilized it, because both the direction and magnitude of coordinated effects depend not only on the number of firms but also on their cost asymmetry.
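For intuition only, here is a textbook grim-trigger calculation of a 'collusive incentive' (the value of colluding forever minus the value of deviating once and being punished thereafter). It is a stylized proxy for the kind of incentive constraint the paper quantifies, not the paper's estimated repeated-game model, and the profit figures and discount factor are hypothetical.

    def collusive_incentive(pi_collude, pi_deviate, pi_punish, delta):
        """Grim-trigger check: positive means collusion is incentive compatible."""
        stay = pi_collude / (1 - delta)                          # collude in every period
        deviate = pi_deviate + delta * pi_punish / (1 - delta)   # cheat once, then punishment forever
        return stay - deviate

    # Hypothetical per-period profits (collusive, deviation, punishment) and discount factor.
    print(collusive_incentive(pi_collude=10.0, pi_deviate=16.0, pi_punish=6.0, delta=0.7))  # > 0, sustainable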
... Some of the most influential computational theory papers include Gowrisankaran (1999), Miao (2005), Asplund & Nocke (2006) and Besanko et al. (2010). On the other hand, thanks to identification results from, among many others, Hotz & Miller (1993), Aguirregabiria & Mira (2002) and Bajari et al. (2007), it has spurred a large empirical literature that includes for example Goettler & Gordon (2011), Gowrisankaran & Rysman (2012), Ryan (2012) and Igami (2017). Second, Ericson & Pakes (1995) is the model used to test most of the different approximation methods described in this paper. ...
Preprint
Full-text available
Dynamic stochastic games notoriously suffer from a curse of dimensionality that makes computing the Markov Perfect Equilibrium of large games infeasible. This article compares the existing approximation methods and alternative equilibrium concepts that have been proposed in the literature to overcome this problem. No method clearly dominates the others but some are dominated in all dimensions. In general, alternative equilibrium concepts outperform sampling-based approximation methods. I propose a new game structure, games with random order, in which players move sequentially and the order of play is unknown. The Markov Perfect equilibrium of this game consistently outperforms all existing approximation methods in terms of approximation accuracy while still being extremely efficient in terms of computational time.
... In empirical industrial organization (IO), some of the most commonly used structural models of oligopoly competition assume complete information, perfect certainty, and Nash equilibrium. For instance, this is the case in models of price competition with differentiated products (Berry, Levinsohn, and Pakes, 1995; Berry and Haile, 2014), and in empirical games of market entry (Bresnahan and Reiss, 1991; Ciliberto and Tamer, 2009). Though there is a substantial literature on structural models of incomplete information in empirical IO, it is mostly concentrated in auctions (Guerre, Perrigne, and Vuong, 2000; Athey and Haile, 2002), and in discrete choice games, both static (Seim, 2006; Sweeting, 2009; Bajari et al., 2010) and dynamic (Aguirregabiria and Mira, 2007; Igami, 2017). Empirical applications to models of quantity or price competition are not so common, though Armantier and Richard (2003) and Aryal and Zincenko (2019) are good exceptions. ...
Article
Full-text available
Firms make decisions under uncertainty and differ in their ability to collect and process information. As a result, in changing environments, firms have heterogeneous beliefs on the behaviour of other firms. This heterogeneity in beliefs can have important implications on market outcomes, efficiency and welfare. This paper studies the identification of firms’ beliefs using their observed actions—a revealed preference and beliefs approach . I consider a general structural model of market competition where firms have incomplete information and their beliefs and profits are nonparametric functions of decisions and state variables. Beliefs may be out of equilibrium. The framework applies both to continuous and discrete choice games and includes as particular cases models of competition in prices or quantities, auction models, entry games and dynamic games of investment decisions. I focus on identification results that exploit an exclusion restriction that naturally appears in models of competition: an observable variable that affects a firm's cost (or revenue) but does not have a direct effect on other firms’ profits. I present identification results under three scenarios—common in empirical industrial organization—on the data available to the researcher.
... Roberts (1990, 1995) [16,17] provide a theory of complementarity between different activities of the firm, directly applied to the complementarity of firms' innovation activities. With the recent development of structural estimation, research has been shifting from theoretical to empirical studies; see, for example, Igami (2017) [12]. The microeconomic foundation of economic growth has become more complicated than what had previously been assumed, such as Dixit and Stiglitz's monopolistic competition. ...
Preprint
In this study, we consider research and development investment by the government. Our study is motivated by the bias in the budget allocation owing to the competitive funding system. In our model, each researcher presents research plans and expenses, and the government selects a research plan in two periods---before and after the government knows its favorite plan---and spends funds on the adopted program in each period. We demonstrate that, in a subgame perfect equilibrium, the government adopts equally as many active plans as possible. In an equilibrium, the selected plans are distributed proportionally. Thus, the investment in research projects is symmetric and unbiased. Our results imply that equally widespread expenditure across all research fields is better than the selection of and concentration in some specific fields.
... In applied work the timing of decisions within periods is typically not observable in the data, thus giving the researcher considerable latitude in specifying a protocol of moves. As a consequence, a number of recent papers have experimented with alternatives to the standard assumption of simultaneous moves (Igami 2017, 2018; Iskhakov 2017). ...
Article
Full-text available
We reformulate the quality ladder model of Pakes and McGuire, Rand Journal of Economics, 25(4), 555–589 (1994) as a dynamic stochastic game with random moves in which each period one firm is picked at random to make an investment decision. Contrasting this model to the standard version with simultaneous moves illustrates the computational advantages of random moves. In particular, the quality ladder model with random moves avoids the curse of dimensionality in computing firms’ expectations over all possible future states and is therefore orders of magnitude faster to solve than its counterpart with simultaneous moves when there are more than just a few firms. Perhaps unexpectedly, the equilibria of the quality ladder model with random moves are practically indistinguishable from those of the model with simultaneous moves. © 2018 Springer Science+Business Media, LLC, part of Springer Nature
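A back-of-the-envelope count of why random (or alternating) moves ease computation, under the simplifying assumption that each firm faces a binary choice: with simultaneous moves a firm's expectation runs over every joint rival action profile, whereas with one randomly picked mover per period at most one rival acts. The function names and numbers are illustrative only.

    def n_outcomes_simultaneous(n_firms, n_choices):
        """Joint rival action profiles a firm must integrate over when all rivals move at once."""
        return n_choices ** (n_firms - 1)

    def n_outcomes_random_move(n_firms, n_choices):
        """With one randomly chosen mover per period, at most one rival acts."""
        return (n_firms - 1) * n_choices

    for n in (3, 5, 10):
        print(n, n_outcomes_simultaneous(n, 2), n_outcomes_random_move(n, 2))
    # With 10 firms and binary choices: 512 joint profiles versus 18 single-mover events.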
... The estimated model shows that removing competition from AMD implies an increase in consumer surplus but lower innovation. Igami (2017) estimates a dynamic oligopoly model of product innovation in the hard disk industry and studies the large gap between the propensities to innovate of incumbents and new entrants (57% gap). The model includes cannibalization between existing and new products, preemptive motives and differences in costs as potential factors that can explain differential propensities to innovate by incumbents and new entrants. ...
Article
Full-text available
We review important developments in empirical industrial organization (IO) over the last three decades. The paper is organized around six topics: collusion, demand, productivity, industry dynamics, interfirm contracts and auctions. We present models that are workhorses in empirical IO and describe applications. For each topic, we discuss at least one empirical application using Canadian data.
... Igami (2017, 2018), as well as Igami and Uetake (2017), demonstrate how Rust's NFXP naturally extends to games with alternating moves (see the first two papers for deterministic orders of moves; see the third paper for a stochastic order of moves). Shogi, chess, and Go are games with a deterministic order of alternating moves. ...
Article
Artificial intelligence (AI) has achieved superhuman performance in a growing number of tasks, including the classical games of chess, shogi, and Go, but understanding and explaining AI remain challenging. This paper studies the machine-learning algorithms for developing the game AIs, and provides their structural interpretations. Specifically, chess-playing Deep Blue is a calibrated value function, whereas shogi-playing Bonanza represents an estimated value function via Rust's (1987) nested fixed-point method. AlphaGo's "supervised-learning policy network" is a deep neural network (DNN) version of Hotz and Miller's (1993) conditional choice probability estimates; its "reinforcement-learning value network" is equivalent to Hotz, Miller, Sanders, and Smith's (1994) simulation method for estimating the value function. Their performances suggest DNNs are a useful functional form when the state space is large and data are sparse. Explicitly incorporating strategic interactions and unobserved heterogeneity in the data-generating process would further improve AIs' explicability.
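As a small illustration of the conditional-choice-probability idea referenced here, the snippet below performs the textbook Hotz and Miller (1993) inversion for i.i.d. type-1 extreme-value shocks, recovering differences in choice-specific values from observed choice frequencies. The counts are made up for the example, and the mapping is the standard logit case, not the AlphaGo network itself.

    import numpy as np

    def ccp_inversion(choice_counts):
        """Hotz-Miller inversion under iid EV(1) shocks:
        v(s, a) - v(s, 0) = ln P(a|s) - ln P(0|s)."""
        counts = np.asarray(choice_counts, dtype=float)
        ccp = counts / counts.sum(axis=1, keepdims=True)   # estimated P(a|s)
        return np.log(ccp) - np.log(ccp[:, [0]])           # value differences relative to action 0

    # Observed choice frequencies for three actions in two states (hypothetical data).
    print(ccp_inversion([[50, 30, 20], [10, 60, 30]]))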
... The well-known challenges in these dynamic estimations are the large number of agents, choices and states, and the existence of multiple equilibria, which imply a substantial computational burden. Igami (2013) overcomes these challenges by modeling a small number of state spaces and choice sets to estimate a dynamic model via maximum likelihood using the nested fixed-point algorithm of Rust (1987). Some of these topics have already been addressed using posted prices but without discrete choice models. ...
Article
Full-text available
This paper estimates the demand for flights in an international air travel market using a unique dataset with detailed information not only on flight choices but also on contemporaneous prices and characteristics of all the alternative non-booked flights. The estimation strategy employs a simple discrete choice random utility model that we use to analyze how choices and their response to prices depend on the departing airport, the identity of the carrier, and the departure date and time. The results show that a 10% increase in prices in a 100-seat aircraft throughout a 100-period selling season decreases quantity demanded by 7.7 seats. We also find that the quantity demanded is more responsive to prices for Delta and American, during morning and evening flights, and that the response to prices changes significantly over different departure dates.
... As highlighted in the introduction, our main results concern the case in which competition is sufficiently stronger than turnover, so that fear of preemption dominates intertemporal spillovers, and weak patents are optimal as they lengthen suboptimal equilibrium patent times. Some evidence of excessive patenting and of low-value patents can be found in numerous anecdotes about trivial patents, and in the staggeringly low rate of patents that are commercialized. Furthermore, we later illustrate in Section 5 how our results can be used to assess quantitatively the optimality of weak patents by means of a numerical exercise. ...
Article
Full-text available
This article studies optimal patents with respect to the timing of innovation disclosure. In a simple model, we identify forces that lead firms to either suboptimally patent too early or too late in equilibrium, and we determine conditions so that stronger patents induce earlier or later equilibrium disclosure. Then, by solving an infinite multistage patent game with a more explicit structure, we describe innovation growth, and derive detailed predictions that can be used for policy experiments. As an application, we calibrate our multistage game using summary statistics from the seeds breeding industry. We find that weaker patent rights may result in welfare gains of 46% relative to the status quo. The gains are achieved because weaker patents reduce competition, thus leading firms to postpone patenting.
... This approach is used in applications such as those presented by Snider (2009), Collard-Wexler (forthcoming), Dunne et al. (2011), Varela (2011), Ellickson et al. (2012), Lin (2012), Aguirregabiria and Mira (2007), or Suzuki (2013), among others. In other papers, such as Pakes et al. (2007), Ryan (2012), Sweeting (2011), or Igami (2012), the normalization involves making the fixed cost equal to zero. Using this non-identification result as a starting point, the purpose of this paper is to study the implications of the 'normalization' approach for the interpretation of the estimated structural functions and, most importantly, for the identification of the effects of comparative static exercises or counterfactual experiments using the estimated model. This issue is important because many empirical questions on market competition, as well as on the evaluation of the effects of public policies in oligopoly industries, involve examining counterfactual changes in some of these structural functions (see Ryan 2012, Dunne et al. 2011, Lin 2012, and Varela 2011). ...
Article
Full-text available
This paper deals with a fundamental identification problem in the structural estimation of dynamic oligopoly models of market entry and exit. Using the standard datasets in existing empirical applications, there are three key components of a firm's profit function that are not separately identified: the fixed cost of an incumbent firm, the entry cost of a new entrant, and the scrap-value of an exiting firm. We study the implications of this result on the power of this class of models to identify the effects of comparative static exercises or public policies involving changes in these structural functions. First, we derive a closed-form relationship between the three unknown structural functions and two functions that are identified from the data. We use this relationship to provide the correct interpretation of the estimated objects that are obtained under the 'normalization assumptions' considered in most applications. Second, we characterize a class of counterfactual experiments that are identified using the estimated model, despite the non-separate identification of the three primitives. Third, we show that there is a general class of counterfactual experiments of economic relevance that are not identified. We present numerical examples that illustrate how ignoring the non-identification of these counterfactuals (i.e., making a 'normalization assumption' on some of the three primitives) generates sizable biases that can modify even the sign of the estimated effects. Finally, we discuss possible solutions to deal with these identification problems.
Article
Our study explores the intricate relationships between technology evolution and innovation, informing the mutual forbearance hypothesis. We propose that in high-technology industries the hypothesis is governed by the stage of technology evolution and is not deterministic. We show how mutual forbearance increases progressively as technology evolves from the emergence phase to the maturity phase. Then, based on a convenience sample of firms from the disk drive industry between 1991 and 1995, an era of ferment prior to the establishment of the dominant design, we examine our hypothesis and find support for mutual forbearance in the mature phase of the technology lifecycle.
Article
In 1993, four years prior to the publication of Clayton Christensen's highly influential book, The Innovator's Dilemma, the Business History Review published an article by Christensen titled “The Rigid Disk Drive Industry: A History of Commercial and Technological Turbulence.” The article relates the theory of disruptive innovation to Alfred D. Chandler's work on large vertically integrated enterprises. It was published during a pivotal era of scholarship on innovation, management practice, and industry evolution, much of which used the history of firms, industries, and technologies to build theory. I survey the impact and critiques of Christensen's research agenda, highlighting how it illustrates where the boundaries associated with the “lessons of history” should be drawn.
Article
Full-text available
Firms often change their operating policy to meet a short-term financial reporting target. Accounting researchers call this opportunistic action real earnings management (REM). They measure REM by the difference between a firm’s costs and those reported by its industry peers. Firms that pursue distinct competitive strategies also display different cost patterns than peers. However, the models that measure REM do not control for differences in competitive strategy. Hence a researcher can misinterpret a cost difference that stems from a firm’s competitive strategy as REM. The researcher would also find a spurious correlation between earnings management and a firm characteristic that varies with competitive strategy. A cause or effect relationship with earnings management could be wrongfully inferred. I suggest improvements in measurement models to avoid misspecification.
Article
This article examines the effects of bank privatization on the number of bank branches operating in small isolated markets in Brazil. We estimate a dynamic game played between Brazilian public and private banks. We find private banks compete with each other as expected. We also find public banks generate positive spillovers for private banks. Our counterfactual study shows that privatization substantially reduces the number of banks. The government can mitigate the effects of privatization by providing subsidies to private banks. Our model predicts that subsidy policies reducing operating costs are more cost-effective than those reducing entry costs in isolated markets in Brazil.
Article
Full-text available
We review important developments in Empirical Industrial Organization (IO) over the last three decades. The paper is organized around six topics: collusion, demand, productivity, industry dynamics, interfirm contracts, and auctions. We present models that are workhorses in empirical IO, and describe applications. For each topic, we discuss at least one empirical application using Canadian data.
Article
Prior studies show that the risk level of each new cohort of listed firms is higher than its predecessor's. We find that these risk differences are persistent and investigate two potential explanations: (1) Each cohort adopts and retains operating innovations that are associated with higher risks, and (2) increasing numbers of younger and less-experienced firms are represented in each new cohort. Our results support the first explanation. Each new cohort uses riskier production technologies and operates in more competitive product markets than its predecessor.
Article
We assess the usefulness of patent statistics as an indicator of innovation using a direct measure of innovation in the hard disk industry (1976–98). Three findings emerge: (1) patents “predict” innovations better than a random guess, and a simple refinement makes them more useful; (2) conditional on innovating, conglomerates and larger firms patent more than specialized startups and smaller firms; and (3) patent reforms seem to make the patent-innovation relationship nonstationary. These results suggest that researchers should use caution when comparing patents across firm types and years, because ill-informed R&D policy interventions may have detrimental impacts on economic growth and welfare.
Article
This paper develops a dynamic model of retail competition and uses it to study the impact of the expansion of a new national competitor on the structure of urban markets. In order to accommodate substantial heterogeneity (both observed and unobserved) across agents and markets, the paper first develops a general framework for estimating and solving dynamic discrete choice models in continuous time that is computationally light and readily applicable to dynamic games. In the proposed framework, players face a standard dynamic discrete choice problem at decision times that occur stochastically. The resulting stochastic-sequential structure naturally admits the use of CCP methods for estimation and makes it possible to compute counterfactual simulations for relatively high-dimensional games. The model and method are applied to the retail grocery industry, into which Wal-Mart began rapidly expanding in the early 1990s, eventually attaining a dominant position. We find that Wal-Mart’s expansion into groceries came mostly at the expense of the large incumbent supermarket chains, rather than the single-store outlets that bore the brunt of its earlier conquest of the broader general merchandise sector. Instead, we find that independent grocers actually thrive when Wal-Mart enters, leading to an overall reduction in market concentration. These competitive effects are strongest in larger markets and those into which Wal-Mart expanded most rapidly, suggesting a diminishing role of scale and a greater emphasis on differentiation in this previously mature industry.
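The paper's key device, decision times that arrive stochastically in continuous time, can be illustrated with a toy simulation (the arrival rates, payoffs, and reduced-form move probabilities below are assumptions, not the authors' model): only one player moves at any instant, which is what keeps estimation and counterfactual computation light.

```python
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([1.0, 1.5])        # assumed Poisson move-arrival rates for two chains
horizon = 20.0
stores = np.array([1, 0])           # current store counts: the payoff-relevant state

def open_prob(i, stores):
    """Assumed reduced-form policy: probability that chain i opens a store when it
    gets a move opportunity, decreasing in the rival's store count."""
    v = 1.0 - 0.8 * stores[1 - i]
    return 1.0 / (1.0 + np.exp(-v))

t, events = 0.0, []
while True:
    waits = rng.exponential(1.0 / rates)    # competing exponential clocks
    i = int(np.argmin(waits))               # only one player moves at a time
    t += waits[i]
    if t > horizon:
        break
    if rng.random() < open_prob(i, stores):
        stores[i] += 1
        events.append((round(t, 2), i, stores.copy()))

print(events)
```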
Article
Full-text available
The effect of competition on innovation incentives has been a controversial subject in economics since Joseph Schumpeter advanced the theory that competitive markets are not necessarily the most effective organizations to promote innovation. The incentive to innovate is the difference in profit that a firm can earn if it invests in research and development compared to what it would earn if it did not invest. The concept is straightforward, yet differences in market structure, the characteristics of innovations, and the dynamics of discovery lead to seemingly endless variations in the theoretical relationship between competition and expenditures on research and development or the outputs of research and development (R&D). This paper surveys the economic theory of innovation, focusing on market structure and its relationship to competition, the distinction between product and process innovations, and the role of exclusive and nonexclusive rights to innovation, and draws conclusions from the different models. Exclusive rights generally lead to greater innovation incentives in more competitive markets, while nonexclusive rights generally lead to the opposite conclusion, although there are important exceptions. The paper reviews the large literature on empirical studies of innovation and finds some support for the predictions of the theory.
Chapter
Full-text available
Technological diffusion is defined widely as the process by which the market for a new technology changes over time and from which production and usage patterns of new products and production processes result. This chapter looks at both the demand and supply sides of this process at differing levels of aggregation, from the worldwide to the interfirm or household level, via consideration of intensive and extensive margins. Realized diffusion patterns are discussed and theoretical underpinnings of the diffusion process explored. Econometric models, data availability, and estimation are also considered, although there is little attempt to be comprehensive regarding the latter, given existing surveys. Diffusion policy is also addressed and some comments on future research directions offered.
Article
Full-text available
This paper focuses on patterns of technological change and on the impact of technological breakthroughs on environmental conditions. Using data from the minicomputer, cement, and airline industries from their births through 1980, we demonstrate that technology evolves through periods of incremental change punctuated by technological breakthroughs that either enhance or destroy the competence of firms in an industry. These breakthroughs, or technological discontinuities, significantly increase both environmental uncertainty and munificence. The study shows that while competence-destroying discontinuities are initiated by new firms and are associated with increased environmental turbulence, competence-enhancing discontinuities are initiated by existing firms and are associated with decreased environmental turbulence. These effects decrease over successive discontinuities. Those firms that initiate major technological changes grow more rapidly than other firms.
Article
Full-text available
Increasingly, technological innovation creates markets for new products and services. To survive, firms must respond to these new markets. How do firms develop the capabilities necessary to succeed in such changing conditions? Some suggest that experience with previous entry builds such capabilities. Others suggest that capabilities arise from experience producing and selling to existing markets. The role of managers is also debated. Some argue that experience with existing markets causes managers to miss entry opportunities. Others argue that managers enter new markets when their firm possesses the experience needed to compete effectively. In this paper, we explore these issues by investigating entry patterns in the disk-drive industry. We investigate the effect of experience in existing markets and experience with previous market entry. We find that experience in previous markets increased the probability that a firm would enter a new market. We show that this experience had greater value if the firm entered the new market. We infer that managers chose to enter these markets to obtain this increase in value.
Article
Full-text available
The objective of this paper is to contribute to the empirical literature that evaluates the effects of public R&D support on private R&D investment. We apply a matching approach to analyze the effects of public R&D support in Spanish manufacturing firms. We examine whether or not the effects are different depending on the size of the firm and the technological level of the sectors in which the firms operate. We evaluate the effect of R&D subsidies on the subsidized firms, considering both the effect of subsidies on firms that would have performed R&D in the absence of public support and also the effect of inducement to undertake R&D activities. We also analyze the effect that concession of subsidies might have on firms which do not enjoy this type of support. The main conclusions indicate absence of “crowding-out”, either full or partial, between public and private spending and that some firms – mainly small and operating in low technology sectors – might not have engaged in R&D activities in the absence of subsidies.
Article
Full-text available
This paper demonstrates that the traditional categorization of innovation as either incremental or radical is incomplete and potentially misleading and does not account for the sometimes disastrous effects on industry incumbents of seemingly minor improvements in technological products. We examine such innovations more closely and, distinguishing between the components of a product and the ways they are integrated into the system that is the product "architecture," define them as innovations that change the architecture of a product without changing its components. We show that architectural innovations destroy the usefulness of the architectural knowledge of established firms, and that since architectural knowledge tends to become embedded in the structure and information-processing procedures of established organizations, this destruction is difficult for firms to recognize and hard to correct. Architectural innovation therefore presents established organizations with subtle challenges that may have significant competitive implications. We illustrate the concept's explanatory force through an empirical study of the semiconductor photolithographic alignment equipment industry, which has experienced a number of architectural innovations.
Article
This article develops and estimates an industry equilibrium model of the Korean electric motor industry from 1991 to 1996. Plant-level decisions on R&D, physical capital investment, entry, and exit are integrated in a dynamic setting with knowledge spillovers. We apply the novel approximation of oblivious equilibrium to estimate the R&D cost, magnitude of knowledge spillovers, adjustment costs of physical investment, and plant scrap value distribution. Knowledge spillovers are essential to explaining the firm-level productivity evolution and the equilibrium market configuration. An R&D subsidy maximizes industry output and is broadly consistent with a past policy initiative of the Korean government.
Article
This article discusses what determines the effect of technological discontinuities on the competitive positions of companies within an industry. Three cases of technological change are analyzed: the change from manual to computer numerically controlled metal cutting machine tools; the change from stand-alone machine tools to flexible manufacturing systems; and the change from non-cellular to cellular mobile telephony. It is argued that the character of a technological discontinuity affects market shares by altering the barriers to entry and mobility, and by being more or less in accordance with the different firms' visions of the future, implying variations in the time needed to detect and accept the new threat or opportunity. A technological discontinuity that involves a new generic technology which substitutes for, rather than adds to, the previous technology base is seen as disruptive. The time actually available for detecting the need to change and to act is limited by the market growth of the new product (the "speed of diffusion" between users). The faster the diffusion is, therefore, the greater is the likelihood that early movers will gain initial advantages. -from Authors
Article
I describe a model of entry with endogenous product-type choices. These choices are formalized as the outcomes of a game of incomplete information in which rivals' differentiated products have nonuniform competitive effects on profits. I estimate the model for location choices in the video retail industry using a nested fixed-point algorithm. The results imply significant returns to product differentiation. Simulations illustrate the tradeoff between demand and intensified competition and the extent to which markets with more scope for differentiation support greater entry.
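A stripped-down version of such an incomplete-information product-type game (illustrative payoff numbers only, not the paper's specification) can be solved by iterating logit best responses on beliefs about the rival's choice probabilities until a fixed point is reached.

```python
import numpy as np

# Two potential entrants choose a product type in {0, 1}; facing a same-type rival
# lowers profit more than facing a differentiated rival (all numbers assumed).
base = np.array([1.0, 0.8])        # stand-alone profit of each product type
same, diff = -0.9, -0.3            # competitive effects: same-type vs. different-type rival

def best_response(q_rival):
    """Logit choice probabilities over types given beliefs q_rival about the rival."""
    # Expected profit of type k = base[k] + q_rival[k]*same + q_rival[1-k]*diff
    pi = base + q_rival * same + q_rival[::-1] * diff
    e = np.exp(pi - pi.max())
    return e / e.sum()

# Symmetric equilibrium: fixed point of the best-response mapping.
q = np.array([0.5, 0.5])
for _ in range(1000):
    q_new = best_response(q)
    if np.max(np.abs(q_new - q)) < 1e-12:
        break
    q = q_new

print(q)   # equilibrium probabilities of choosing each product type
```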
Article
This paper examines the probability and timing of entry by industry incumbents into emerging technical subfields. When a new technical subfield of an industry emerges, an industry incumbent faces opposing entry incentives, either to wait until technical and market uncertainties subside or to stake out a strong position early. This paper argues that an incumbent is likely to enter a new subfield if the firm's core products are threatened or if it possesses industry-specialized supporting assets. The greater the competitive threat, the less likely an incumbent is to enter but the earlier it will do so. The predictions are supported with analysis of 30 years of entry data from five subfields of the American medical diagnostic imaging industry.
Article
How are prices set in the American automobile oligopoly? This paper seeks empirical estimates of the extent of departure from marginal-cost pricing and of the effects of foreign competition. The model estimated has product differentiation, multiproduct firms, and heterogeneity in consumer tastes. The estimation presumes that product type (proxied by engineering specifications) is exogenous to the price/quantity market equilibrium. Cross-section results for the 1977 and 1978 model years yield price-cost margins around 10%. Import competition has the effect of lowering equilibrium margins for compact and subcompact models.
Article
In its early years, the disk drive industry was led by a group of large-scale, integrated firms of the sort that Alfred D. Chandler, Jr., observed in his studies of several of the world's largest industries. The purpose of this history is to explore why it was so difficult for the leading disk drive manufacturers to replicate their success when technology and the structure of markets changed. The most successful firms aggressively developed the new component technologies required to address their leading customers’ needs, but this attention caused leading drive makers to ignore a sequence of emerging market segments, where innovative disk drive technologies were deployed by new entrants. As the performance of these new-architecture products improved at a rapid pace, the new firms were eventually able to conquer established markets as well. As a consequence, most of the integrated firms that established the disk drive industry were driven from it, displaced by networks of tightly focused, less integrated independent companies.
Article
When radical technological change transforms an industry established firms sometimes fail drastically and are displaced by new entrants, yet other times survive and prosper. Drawing upon an unusually rich data set that covers the technological and competitive history of the typesetter industry from 1886 to 1990, this paper uses a combination of quantitative and qualitative analysis to unravel this process of creative destruction. It argues that the ultimate commercial performance of incumbents vs. new entrants is driven by the balance and interaction of three factors: investment, technical capabilities, and appropriability through specialized complementary assets. In this industry, specialized complementary assets played a crucial role in buffering incumbents from the effects of competence destruction, and an analysis that examined investment or technical capabilities in isolation would have led to misleading results. This work thus highlights the importance of considering multiple perspectives when examining the competitive implications of technological change. © 1997 by John Wiley & Sons, Ltd.
Article
A model (developed in 1975 to 1978) suggests how the character of a firm's innovation changes as a successful enterprise matures and how other companies may change themselves to foster innovation as they grow and prosper. This has become a highly cited paper.
Book
Most analysts of corporations and industries adopt the focal perspective of a single prototypical organization. Many analysts also study corporations primarily in terms of their internal organizational structures or as complex systems of financial contracts. Glenn Carroll and Michael Hannan bring fresh insight to our understanding of corporations and the industries they comprise by looking beyond prototypical structures to focus on the range and diversity of organizations in their social and economic setting. The result is a rich rendering of analysis that portrays whole populations and communities of corporations.The Demography of Corporations and Industries is the first book to present the demographic approach to organizational studies in its entirety. It examines the theory, models, methods, and data used in corporate demographic research. Carroll and Hannan explore the processes by which corporate populations change over time, including organizational founding, growth, decline, structural transformation, and mortality. They review and synthesize the major theoretical mechanisms of corporate demography, ranging from aging and size dependence to population segregation and density dependence. The book also explores some selected implications of corporate demography for public policy, including employment and regulation.In this path-breaking book, Carroll and Hannan demonstrate why demographic research on corporations is important; describe how to conduct demographic research; specify fruitful areas of future research; and suggest how the demographic perspective can enrich the public discussion of issues surrounding the corporation in our constantly evolving industrial society. All researchers and analysts with an interest in this topic will find The Demography of Corporations and Industries an invaluable resource.
Chapter
This chapter reviews the empirical literature on the determination of firms’ and industries’ innovative activity and performance, highlighting the questions addressed, the approaches adopted, impediments to progress in the field, and research opportunities. We review the “neo-Schumpeterian” empirical literature that examines the effects of firm size and market concentration upon innovation, focusing on robust findings, questions of interpretation, and the identification of major gaps. We also consider the more modest literature that considers the effect on innovation of firm characteristics other than size. Finally, we review the literature that considers three classes of factors that affect interindustry variation in innovative activity and performance: demand, appropriability, and technological opportunity conditions.
Article
Estimating structural models is often viewed as computationally difficult, an impression partly due to a focus on the nested fixed-point (NFXP) approach. We propose a new constrained optimization approach for structural estimation. We show that our approach and the NFXP algorithm solve the same estimation problem, and yield the same estimates. Computationally, our approach can have speed advantages because we do not repeatedly solve the structural equation at each guess of structural parameters. Monte Carlo experiments on the canonical Zurcher bus-repair model demonstrate that the constrained optimization approach can be significantly faster.
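The difference from NFXP can be sketched as follows (a toy model with invented primitives and fake data, not the authors' implementation): the value function is treated as an additional block of unknowns and the Bellman equation enters the optimization as equality constraints, so the dynamic program is never solved to convergence at intermediate parameter guesses.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

S, beta = 5, 0.9
F0 = np.zeros((S, S)); F0[:, 0] = 1.0            # action 0 resets the state
F1 = np.eye(S, k=1); F1[-1, -1] = 1.0            # action 1 moves the state up by one

# Fake data: observed (state, action) pairs, assumed for illustration.
states  = np.array([0, 1, 2, 3, 4, 2, 3, 1, 0, 4])
actions = np.array([1, 1, 1, 0, 0, 1, 0, 1, 1, 0])

def unpack(z):
    theta, EV = z[:2], z[2:]
    u = np.column_stack([-theta[1] * np.ones(S), -theta[0] * np.arange(S)])
    v = u + beta * np.column_stack([F0 @ EV, F1 @ EV])    # choice-specific values
    return v, EV

def neg_loglik(z):
    v, _ = unpack(z)
    logp = v - logsumexp(v, axis=1, keepdims=True)
    return -logp[states, actions].sum()

def bellman_residual(z):
    v, EV = unpack(z)
    return logsumexp(v, axis=1) - EV                      # = 0 at the fixed point

z0 = np.concatenate([[0.1, 1.0], np.zeros(S)])
fit = minimize(neg_loglik, z0, method="SLSQP",
               constraints=[{"type": "eq", "fun": bellman_residual}])
print(fit.x[:2])      # structural parameters; fit.x[2:] is the implied value function
```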
Article
This article investigates the welfare implications of the rapid innovation in central processing units (CPUs), and asks whether it results in inefficient elimination of basic personal computer (PC) products. I analyse a game in which firms make multiple discrete product choices, and tackle challenges such as partial identification and sample selection. Estimation results demonstrate that the demand for PCs is highly segmented, and that fixed costs consume a substantial portion of the per-unit producer profit. The estimated model implies that Intel's introduction of its Pentium M chip contributed significantly to the growth of the mobile PC segment and to consumer welfare. The lion's share of the benefits to consumers was enjoyed by the 20% least price-sensitive consumers. I also find that the Pentium M crowded out the Pentium III and Pentium 4 technologies, and that the benefits to consumers from keeping those older products on the shelf would have been comparable to the added fixed costs. While total welfare cannot be increased by keeping older technologies on the shelf, such a policy would have allowed the benefits from innovation to “trickle down” to price-sensitive households, improving their access to mobile computing.
Article
We estimate an equilibrium model of dynamic oligopoly with durable goods and endogenous innovation to examine the effect of competition on innovation in the personal computer microprocessor industry. Firms make dynamic pricing and investment decisions while consumers make dynamic upgrade decisions, anticipating product improvements and price declines. Consistent with Schumpeter, we find that the rate of innovation in product quality would be 4.2 percent higher without AMD present, though higher prices would reduce consumer surplus by $12 billion per year. Comparative statics illustrate the role of product durability and provide implications of the model for other industries.
Article
This paper develops a method for inference in dynamic discrete choice models with serially correlated unobserved state variables. Estimation of these models involves computing high-dimensional integrals that are present in the solution to the dynamic program and in the likelihood function. First, the paper proposes a Bayesian Markov chain Monte Carlo estimation procedure that can handle the problem of multidimensional integration in the likelihood function. Second, the paper presents an efficient algorithm for solving the dynamic program suitable for use in conjunction with the proposed estimation procedure.
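The flavor of the sampler can be conveyed with a toy random-walk Metropolis–Hastings step on a simple likelihood (illustrative data and prior only); in the paper's setting, each iteration would additionally draw the serially correlated latent states by data augmentation and evaluate the dynamic program.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: binary choices whose log-likelihood stands in for the (much harder)
# integrated likelihood of the dynamic model with serially correlated unobservables.
x = rng.normal(size=200)
y = (rng.random(200) < 1 / (1 + np.exp(-1.5 * x))).astype(int)

def loglik(theta):
    p = 1 / (1 + np.exp(-theta * x))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def logprior(theta):
    return -0.5 * theta**2 / 10.0          # N(0, 10) prior, assumed

# Random-walk Metropolis-Hastings; each iteration in the paper would also draw the
# latent serially correlated states given theta (data augmentation).
theta, draws = 0.0, []
lp = loglik(theta) + logprior(theta)
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = loglik(prop) + logprior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(theta)

print(np.mean(draws[1000:]), np.std(draws[1000:]))   # posterior mean and sd
```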
Article
In many industries, one important method of diffusion is through employee mobility: many of the entering firms are started by employees from incumbent firms using some of their former employer's technological know-how. This article explores the effect of incorporating this mechanism in a general industry framework by allowing employees to imitate their employers' know-how. The equilibrium is Pareto optimal because the employees “pay” for the possibility of learning their employers' know-how. The model's implications are consistent with data from the rigid disk drive industry. These implications concern the effects of know-how on firm formation and survival.
Article
This paper studies a competitive dynamic model with firm-level uncertainty and derives implications for the distribution of firm values and Tobin's q. Allowing for entry and exit, the model determines endogenously the degree of selection. A consequence of this selection is that average industry q values are biased above one. As parameters describing the technology and firm-level uncertainty are changed, the equilibrium distribution of q values changes. These comparative statics are developed in the paper.
Article
We analyze the evolution of four new products that experienced an initial rise and then extreme shakeout in their number of manufacturers: automobiles, tires, televisions, and penicillin. Data on entry, exit, and innovation are collected for each product to test theories of industry shakeouts. Hazard analyses indicate that earlier entrants had persistently lower hazards during the shakeouts, which was related to their greater rates of innovation. Our findings suggest shakeouts are not triggered by particular technological or other events but are part of a competitive process in which the most able early entrants achieve dominant market positions through innovation.
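The hazard analyses described here can be approximated by a simple exponential (constant-hazard) exit model with entry-timing and innovation covariates, estimated by maximum likelihood on right-censored spells; the data-generating numbers and variable names below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 300
early = rng.integers(0, 2, size=n)                 # 1 = early entrant (assumed covariate)
innov = rng.poisson(2 + 2 * early)                 # innovation counts, higher for early entrants

# Simulate exit times with hazard h_i = exp(b0 + b1*early_i + b2*innov_i), censored at T_cens.
b_true = np.array([-2.0, -0.3, -0.2])
h = np.exp(b_true[0] + b_true[1] * early + b_true[2] * innov)
t = rng.exponential(1 / h)
T_cens = 15.0
event = (t <= T_cens).astype(int)
t = np.minimum(t, T_cens)

def neg_loglik(b):
    """Exponential hazard with right censoring: log L = sum_i [d_i*log h_i - h_i*t_i]."""
    h = np.exp(b[0] + b[1] * early + b[2] * innov)
    return -(event * np.log(h) - h * t).sum()

fit = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
print(fit.x)    # negative coefficients = lower exit hazard for early and innovative firms
```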
Article
This paper reviews a framework for numerically analyzing dynamic interactions in imperfectly competitive industries. The framework dates back to Ericson and Pakes [1995. Review of Economic Studies 62, 53–82], but it is based on equilibrium notions that had been available for some time before, and it has been extended in many ways by different authors since. The framework requires as input a set of primitives which describe the institutional structure in the industry to be analyzed. The framework outputs profits and policies for every incumbent and potential entrant at each possible state of the industry. These policies can be used to simulate the distribution of sample paths for all firms from any initial industry structure. The sample paths generated by the model can be quite different depending on the primitives, and most of the extensions were designed to enable the framework to accommodate empirically relevant cases that required modification of the initial structure. The sample paths possess similar properties to those observed in (the recently available) panel data sets on industries. These sample paths can be used either for an analysis of the likely response to a policy or an environmental change, or as the model's implication in an estimation algorithm. We begin with a review of an elementary version of the framework and a report on what is known about its analytic properties. Much of the rest of the paper deals with computational issues. We start with an introduction to iterative techniques for computing equilibrium that are analogous to the techniques used to compute the solution to single agent dynamic programming problems. This includes discussions of the determinants of the computational burden of these techniques, and the mechanism implicitly used to select an equilibrium when multiple equilibria are possible. We then outline a number of techniques that might be used to reduce the computational burden of the iterative algorithm. This section includes discussions of both the implications of differences in modeling assumptions used in the alternative techniques, and a discussion of the likely relevance of the different techniques for different institutional structures. A separate section reports on a technique for computing multiple equilibria from the same set of primitives. The paper concludes with a review of applications of the framework and a brief discussion of areas where further development of the framework would seem warranted.
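A heavily simplified version of the framework's computational loop, for two symmetric firms and a handful of quality states with assumed profit and investment-success functions (not the reference implementation), iterates jointly on values and investment policies until both converge.

```python
import numpy as np

K, beta, cost, a = 5, 0.925, 1.0, 2.0            # assumed primitives
x_grid = np.linspace(0.0, 1.0, 11)               # investment choices

def profit(wi, wj):
    # Reduced-form per-period profit: increasing in own quality, decreasing in rival's.
    return np.exp(wi - wj) / (1 + np.exp(wi - wj))

def succ(x):
    return a * x / (1 + a * x)                   # probability own quality rises by one

V = np.zeros((K, K))                             # V[own quality, rival quality]
X = np.zeros((K, K))                             # investment policy
for _ in range(500):
    V_new, X_new = np.zeros_like(V), np.zeros_like(X)
    for wi in range(K):
        for wj in range(K):
            up_i, up_j = min(wi + 1, K - 1), min(wj + 1, K - 1)
            pj = succ(X[wj, wi])                 # rival uses its current symmetric policy
            best, x_best = -np.inf, 0.0
            for x in x_grid:
                pi_ = succ(x)
                cont = (pi_ * pj * V[up_i, up_j] + pi_ * (1 - pj) * V[up_i, wj] +
                        (1 - pi_) * pj * V[wi, up_j] + (1 - pi_) * (1 - pj) * V[wi, wj])
                val = profit(wi, wj) - cost * x + beta * cont
                if val > best:
                    best, x_best = val, x
            V_new[wi, wj], X_new[wi, wj] = best, x_best
    if np.max(np.abs(V_new - V)) < 1e-8 and np.max(np.abs(X_new - X)) < 1e-8:
        break
    V, X = V_new, X_new

print(np.round(X, 2))    # investment policy as a function of (own, rival) quality
```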
Article
A number of market failures have been associated with R&D investments and significant amounts of public money have been spent on programs to stimulate innovative activities. In this paper, we review some recent microeconometric studies evaluating effects of government-sponsored commercial R&D. We pay particular attention to the conceptual problems involved. Neither the firms receiving support, nor those not applying, constitute random samples. Furthermore, those not receiving support may be affected by the programs due to spillover effects which often are the main justification for R&D subsidies. Constructing a valid control group under these circumstances is challenging, and we relate our discussion to recent advances in econometric methods for evaluation studies based on non-experimental data. We also discuss some analytical questions, beyond these estimation problems, that need to be addressed in order to assess whether R&D support schemes can be justified. For instance, what are the implications of firms' R&D investments being complementary to each other, and to what extent are potential R&D spillovers internalized in the market?
Article
This paper outlines recently developed techniques for estimating the primitives needed to empirically analyze equilibrium interactions and their implications in oligopolistic markets. It is divided into an introduction and three sections: a section on estimating demand functions, a section on estimating production functions, and a section on estimating “dynamic” parameters (parameters estimated through their implications on the choice of controls which determine the distribution of future profits). The introduction provides an overview of how these primitives are used in typical I.O. applications, and explains how the individual sections are structured. The topics of the three sections have all been addressed in prior literature. Consequently, each section begins with a review of the problems I.O. researchers encountered in using the prior approaches. The sections then continue with a fairly detailed explanation of the recent techniques and their relationship to the problems with the prior approaches. Hopefully the detail is rich enough to enable the reader to actually program up a version of the techniques and use them to analyze data. We conclude each section with a brief discussion of some of the problems with the more recent techniques. Here the emphasis is on when those problems are likely to be particularly important, and on recent research designed to overcome them when they are.
Book
Analyzes how successful firms fail when confronted with technological and market changes, prescribing a list of rules for firms to follow as a solution. Precisely because of their adherence to good management principles, innovative, well-managed firms fail at the emergence of disruptive technologies - that is, innovations that disrupt the existing dominant technologies in the market. Unfortunately, it usually does not make sense to invest in disruptive technologies until after they have taken over the market. Thus, instead of exercising what are typically good managerial decisions, at the introduction of technical or market change it is very often the case that managers must make counterintuitive decisions not to listen to customers, to invest in lower-performance products that produce lower margins, and to pursue small markets. From analysis of the disk drive industry, a set of rules is devised - the principles of disruptive innovation - for managers to measure when traditional good management principles should be followed or rejected. According to the principles of disruptive innovation, a manager should plan to fail early, often, and inexpensively, developing disruptive technologies in small organizations operating within a niche market and with a relevant customer base. A case study in the electric-powered vehicles market illustrates how a manager can overcome the challenges of disruptive technologies using these principles of disruptive innovation. The mechanical excavator industry in the mid-twentieth century is also described, as an example in which most companies failed because they were unwilling to forego cable excavator technology for hydraulics machines. While there is no "right answer" or formula to use when reacting to unpredictable technological change, managers will be able to adapt as long as they realize that "good" managerial practices are only situationally appropriate. Though disruptive technologies are inherently high-risk, the more a firm invests in them, the more it learns about the emerging market and the changing needs of consumers, so that incremental advances may lead to industry-changing leaps. (CJC)
Article
The author raises the classic question of welfare economics in relation to invention: to what extent does perfect competition result in optimal allocation of resources? There are three reasons for the possible failure of perfect competition to lead to optimal resource allocation: marginal-cost pricing, divergence between social and private benefit (or cost), and allocation of resources under uncertainty. The last receives attention in this chapter; specifically, only in the context of uncertainty arises the critical idea of information. Improving the efficiency of the economy with respect to risk may decrease technical efficiency. Devices for mitigating adverse effects of insurance are co-insurance and cost-plus contracts. Uncertainty creates a subtle problem in resource allocation: information becomes a commodity with economic value, and the economic characteristics of information as a commodity, and of invention as a process for the production of information, are examined. The classic problem of indivisible commodities applies to information, and the problem of allocation in the presence of indivisibilities appears. The costs of transmitting information create difficulties in allocation. Invention is a process full of risk. Research by corporations is one way to reduce risk. Turning invention into property rights results in underutilization of information. Profitability of invention thus leads to non-optimal resource allocation. The failure of a competitive system to achieve an optimal resource allocation is shown to be due to all three reasons (stated above). Incentives to invent can exist for monopolistic and competitive markets. A model is developed, and theoretical reasons are given to explain the biases that result in the misallocations and inefficiencies in the economic system. Some further implications for alternative forms of economic organization are offered. Optimal and efficient allocation of invention could require government or other non-profit finance, and provisions for innovation by individual talents (rather than by firms) could be devised; problems with these approaches are also noted. (TNM)
Article
Proposes a theory that explains why smaller firms have higher and more variable growth rates than larger firms. Relying on employer heterogeneity and market selection to generate patterns of employer growth and failure, the model states that efficient firms grow and survive while inefficient firms decline and fail, regardless of firm size. However, firms that fail are actually firms that, if given more time to succeed, would have grown more slowly. These slow growing firms are most often smaller firms. Also provided is a behavior characterization of entry and prices in equilibrium, which is defined as a pair of functions that characterize optimal output and exit behavior of firms. (SFL)
Article
Learning-by-doing and organizational forgetting are empirically important in a variety of industrial settings. This paper provides a general model of dynamic competition that accounts for these fundamentals and shows how they shape industry structure and dynamics. We show that forgetting does not simply negate learning. Rather, they are distinct economic forces that interact in subtle ways to produce a great variety of pricing behaviors and industry dynamics. In particular, a model with learning and forgetting can give rise to aggressive pricing behavior, varying degrees of long-run industry concentration ranging from moderate leadership to absolute dominance, and multiple equilibria. Copyright 2010 The Econometric Society.
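The paper's two building blocks, know-how that accumulates with output and stochastically depreciates through forgetting, can be sketched as a state transition in a few lines; the learning-curve and forgetting parameters below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
M, eta, delta = 15, 0.5, 0.1          # max know-how stock, learning elasticity, forgetting rate

def marginal_cost(q):
    return 10.0 * (q + 1) ** (-eta)   # learning curve: cost falls with know-how q

q, path = 0, []
for t in range(100):
    # A sale arrives with probability decreasing in cost (a stand-in for the pricing game).
    made_sale = rng.random() < 1.0 / (1.0 + 0.1 * marginal_cost(q))
    if made_sale and q < M:
        q += 1                        # learning-by-doing: know-how rises with output
    if q > 0 and rng.random() < delta * q / M:
        q -= 1                        # organizational forgetting: stochastic depreciation
    path.append((t, q, round(marginal_cost(q), 2)))

print(path[-5:])
```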