The Sherman Antitrust Act is the principal federal statute regulating anticompetitive conduct. This volume examines the rich legislative history and political economy of the Act and the current debates surrounding it. Contributors include Robert Bork, Richard Posner, Charles Rule, Louis Kaplow, and Eleanor Fox.
The structural presumption against mergers in highly concentrated markets has had a controversial history. The debate has centered on the error rates, and especially the rate of false positives, from reliance on a simple structural standard for presuming a merger to be anticompetitive. This article introduces the first systematic evidence into that debate. It also uses that evidence to examine the empirical validity of the so-called “safe harbor,” the range of concentration within which mergers are presumed unlikely to harm competition. The evidence is based on a substantial compilation of carefully studied mergers whose competitive outcomes are matched to data on concentration, the change in concentration due to the merger, and the number of significant competitors remaining after each merger. Statistical tests confirm that the current threshold in the Horizontal Merger Guidelines identifies anticompetitive mergers with a high degree of accuracy, as would somewhat tighter standards. Corollary findings indicate that a count of remaining significant competitors produces a slightly higher rate of correct predictions, and that the concentration-based safe harbor is at best rough guidance, since a number of anticompetitive mergers fall within that zone. Data on recent merger enforcement practice are contrasted with these findings.
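For readers unfamiliar with the structural screens the article tests, the following is a minimal sketch, not drawn from the article's data, of how a concentration-based presumption is applied to a hypothetical merger. The default cutoffs of 2,500 HHI and a 200-point increase are the 2010 Horizontal Merger Guidelines levels, and the 5% cutoff for counting "significant" competitors is an assumption made only for this illustration.

# Illustrative structural screen for a hypothetical merger; shares are percentages.
def hhi(shares):
    # Herfindahl-Hirschman Index: the sum of squared market shares.
    return sum(s * s for s in shares)

def screen_merger(shares, i, j, hhi_cutoff=2500, delta_cutoff=200, sig_share=5.0):
    pre = hhi(shares)
    merged = [s for k, s in enumerate(shares) if k not in (i, j)]
    merged.append(shares[i] + shares[j])          # combine the merging firms
    post = hhi(merged)
    delta = post - pre
    remaining = sum(1 for s in merged if s >= sig_share)   # significant-competitor count
    presumed = post > hhi_cutoff and delta > delta_cutoff
    return post, delta, remaining, presumed

# A 5-to-4 merger of two 20% firms in a 30/20/20/20/10 market:
# post-merger HHI 3000, increase of 800, 4 significant firms remain, presumption triggered.
print(screen_merger([30, 20, 20, 20, 10], 1, 2))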
Efficiencies have long been an important aspect of the antitrust analysis of mergers, but in recent years there have been certain noteworthy changes in the nature and analysis of efficiencies. One change appears to be the greater ability of the antitrust agencies to evaluate conventional efficiencies carefully and critically. This is demonstrated by the analysis of efficiencies associated with a number of recent proposed mergers before the antitrust and regulatory agencies. The second important change concerns the types of efficiencies typically claimed by merging firms. Whereas in earlier years these largely involved scale economies, they now more often focus on dynamic efficiencies, quality benefits, and vertical economies. This article discusses each of these in turn and argues that these newer types of efficiencies pose new and greater challenges for analysis.
This article reviews the formulation and evolution of the Philadelphia National Bank anticompetitive presumption through the lens of decision theory and Bayes' Law. It explains how economic theory, empirical evidence, and experience are used to determine a presumption, and how that presumption interacts with the reliability of the relevant evidence to rationally set the appropriate burden of production and burden of persuasion for rebutting the presumption. The article applies this reasoning to merger presumptions. It also sketches a number of non-market-share structural factors that might be used to supplement or replace the current legal and enforcement presumptions for mergers. Finally, it discusses the potential for conflicting presumptions and how such conflicts might best be resolved.
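The interaction the article describes between the strength of a presumption and the reliability of rebuttal evidence can be illustrated with Bayes' rule in odds form; the numbers below are purely hypothetical and are not taken from the article.

def posterior(prior, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds x likelihood ratio.
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A strong structural presumption (prior 0.8 that the merger is anticompetitive)
# survives weakly probative rebuttal evidence but not highly probative evidence.
print(posterior(0.8, 0.5))    # about 0.67
print(posterior(0.8, 0.05))   # about 0.17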
This is one of the first articles to demonstrate that the primary goal of antitrust is neither exclusively to enhance economic efficiency nor to address any social or political factor. Rather, the overriding intent behind the merger laws was to prevent prices to purchasers from rising due to mergers (a wealth-transfer concern). This is the first article to show how to analyze mergers with this goal in mind. Doing so challenges the fundamental underpinnings of Williamsonian merger analysis (which assumes mergers should be evaluated only in terms of net efficiency effects). In this and three related articles we redo the Williamsonian tradeoff, using price to consumers, instead of net efficiencies, as our focus. We show how a price focus would require substantially more efficiencies to justify an otherwise anticompetitive merger. We demonstrate this by calculating how large the necessary efficiency gains would have to be to prevent price increases under different market conditions. We also show that under efficiency analysis even a tiny cost savings would justify almost any merger, so a purely efficiency-based approach effectively would negate the anti-merger laws. Finally, we demonstrate that price analysis is much more predictable and easier to administer than efficiency analysis.
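To make the contrast concrete, here is a stylized calculation in the spirit of the article's argument; the model and numbers are illustrative assumptions, not the article's own. It asks how far marginal cost must fall, after a merger of two symmetric Cournot competitors to monopoly under linear demand, for the post-merger price not to rise.

# Linear demand P = a - Q; two symmetric Cournot firms with marginal cost c.
a, c = 100.0, 40.0
p_duopoly = (a + 2 * c) / 3       # pre-merger Cournot price = 60
# The post-merger monopolist prices at (a + c2) / 2, so holding price constant requires:
c2_max = 2 * p_duopoly - a        # = 20
print(p_duopoly, c2_max)          # marginal cost must fall from 40 to 20 (a 50% reduction)
# Under a pure net-efficiency (Williamsonian) test, a far smaller saving could
# offset the deadweight loss and justify the same merger.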
Cost-effectiveness analysis, which ranks projects by quality-adjusted life years gained per dollar spent, is widely used in the evaluation of health interventions. We show that cost-effectiveness analysis can be derived from two axioms: society prefers Pareto improvements, and society values discounted life years, lived in perfect health, equally for each person. These axioms generate a unique social preference ordering, allowing us to find the cost-effectiveness threshold up to which health projects should be funded and to extend cost-effectiveness analysis to give a consistent method of project evaluation across all sectors of the economy.
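As a minimal illustration of the funding rule described above, the threshold and project figures below are assumed for the example and are not taken from the paper.

# Fund a project if its incremental cost per QALY gained falls below the threshold.
THRESHOLD = 50_000            # assumed willingness to pay per QALY, in dollars

def cost_per_qaly(delta_cost, delta_qalys):
    return delta_cost / delta_qalys

projects = {"screening program": (120_000, 4.0), "new drug": (90_000, 1.2)}
for name, (dc, dq) in projects.items():
    ratio = cost_per_qaly(dc, dq)
    print(name, round(ratio), "fund" if ratio <= THRESHOLD else "do not fund")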