Article

A note on vertical search engines' foreclosure

... First of all, network effects in search are combined with a form of learning by doing (or learning by searching). This is due to technological reasons: search engines find more relevant results for each query when there are more queries, and users' subsequent clicks provide information on which websites were most relevant for particular keywords (see also Tarantino, 2011). Through this feedback mechanism, users themselves improve the algorithms that govern the search engine, and the impact is relatively larger on tail queries (compared with the most common queries, on which all search engines reach a relatively large scale of queries). ...
... On this front, the Italian and French competition authorities have obliged Google to guarantee that press publishers will be able to request and obtain exclusion from Google News, but without being de-listed from the general search. On the antitrust front, the new services of Google have rapidly gained success over competing vertical search services, but possibly with the help of manipulated ranking in the natural search of Google, which ends up marginalizing any competing vertical engine (Tarantino, 2011). As long as a dominant search platform gives priority ... [Footnote: Notice that this is quite different from what emerges in some of the other software platforms, where multihoming on both sides is often not necessary (think of the market for video games, where consumers do not multihome and choose a single platform, but multihoming by game developers guarantees the process of development of network effects for all competing platforms).] ...
Article
Full-text available
This article analyzes the role of leadership in multi-sided markets. Search and display advertising are better characterized by (respectively) quantity and price competition with leadership by a dominant firm and few followers. A platform that reached dominance in search may have an incentive to limit services to consumers to be aggressive with the advertisers, or may be more likely to exploit its scale in search to build barriers to entry and to adopt click-weighted auctions to manipulate the pricing of sponsored links. A dominant platform in display advertising may increase the rewards of content providers to increase prices on advertisers, or may adopt exclusive clauses to predate on other platforms. This creates the potential for multiple antitrust issues. The author thanks participants at the IV and V Intertic Conferences, the III Conference on Recent Developments in Antitrust Policy, and the 2011 CRESSE Conference for useful discussion on the topic.
... Second, by tweaking a search engine's "organic" or "unsponsored" search results to obfuscate rivals (see Tarantino (2011)). "Unfair" (whatever that means) ranking of search results could put a firm out of business in a world in which being on page 2 of a search query is pretty much like not existing. ...
... On this front, the Italian and French competition authorities have obliged Google to guarantee that press publishers will be able to request and obtain exclusion from Google News, but without being de-listed from the general search. On the antitrust front, the new services of Google have rapidly gained success over competing vertical search services, but possibly with the help of manipulated ranking in the natural search of Google, which ends up marginalizing any competing vertical engine (Tarantino, 2011). As long as a dominant search platform gives priority to its own vertical services and diverts traffic away from its competitors, it is destined to reach leadership in any service provided, hurting competition on the merits. ...
Article
I analyze the role of leadership in multi-sided markets such as online advertising. Search and display advertising are better characterized by (respectively) quantity and price competition. A platform that reached dominance in search may have an incentive to limit services to consumers to be aggressive with the advertisers, to exploit its scale in search to build barriers to entry, or to adopt click-weighted auctions to manipulate the pricing of sponsored links. On the other side, a dominant platform in display advertising may increase the rewards of content providers to increase prices on advertisers, or may adopt exclusive clauses to predate on other platforms.
Article
Vertical search is an important topic in the design of search engines, as it offers more abundant and more precise results on specific domains than large-scale search engines like Google and Baidu. Prior to this paper, most vertical search engines were built from manually selected and edited materials, which was time-consuming and costly. In this paper, we propose a new information resource discovery model and build a crawler for a vertical search engine that can selectively fetch webpages relevant to a pre-defined topic. The model includes three aspects. First, webpages are transformed into term vectors. TF-TUF, short for Term Frequency-Topic Unbalanced Factor, is proposed as the weighting scheme in the vector space model. The scheme puts more weight on terms whose frequencies differ strongly among topics, which we believe contribute more to topic prediction. Second, we use a Bayes method to predict the topics of the webpages, trained in advance on topic-labeled text; the specific Bayes prediction method is illustrated in the algorithm section. Third, we create a focused crawler driven by the topic prediction result, which is used not only to filter out irrelevant webpages but also to direct the crawler toward the areas most likely to be topic-relevant. Together, these three aspects allow topic-relevant materials on the web to be discovered efficiently when building a vertical search engine. Our experiment shows that the average prediction accuracy of the proposed model can exceed 85%. As an application, we also used the proposed model to build "Search Engine for S&T" (http://nstr.com.cn/search), a vertical search engine for the science field.
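The abstract does not give the TF-TUF formula or the exact classifier, so the following is only a minimal sketch of the two ingredients it names: a term weight that rewards terms whose frequencies are unbalanced across topics (here taken, as an illustrative assumption, to be the max/mean ratio of a term's per-topic counts), and a standard multinomial naive Bayes predictor with Laplace smoothing standing in for the paper's "Bayes method". All function names are hypothetical.

```python
import math
from collections import Counter, defaultdict

def train(labeled_docs):
    """Accumulate per-topic term counts from (topic, tokens) pairs."""
    topic_term_freqs = defaultdict(Counter)
    for topic, tokens in labeled_docs:
        topic_term_freqs[topic].update(tokens)
    return topic_term_freqs

def tuf(term, topic_term_freqs):
    """Topic Unbalanced Factor (illustrative assumption): ratio of the
    term's maximum per-topic count to its mean per-topic count, with
    add-one smoothing. Terms concentrated in one topic score higher."""
    freqs = [tf.get(term, 0) + 1 for tf in topic_term_freqs.values()]
    return max(freqs) / (sum(freqs) / len(freqs))

def tf_tuf_vector(doc_tokens, topic_term_freqs):
    """Weight each term in the document by term frequency times TUF."""
    tf = Counter(doc_tokens)
    return {t: c * tuf(t, topic_term_freqs) for t, c in tf.items()}

def predict(tokens, topic_term_freqs):
    """Multinomial naive Bayes with Laplace smoothing: pick the topic
    maximizing the sum of log term likelihoods."""
    vocab = {t for tf in topic_term_freqs.values() for t in tf}
    best_topic, best_score = None, float("-inf")
    for topic, tf in topic_term_freqs.items():
        total = sum(tf.values()) + len(vocab)
        score = sum(math.log((tf.get(t, 0) + 1) / total) for t in tokens)
        if score > best_score:
            best_topic, best_score = topic, score
    return best_topic
```

In a focused crawler of the kind described, `predict` would be applied to each fetched page: pages predicted off-topic are discarded, while on-topic pages are kept and their outgoing links queued, steering the crawl toward topic-relevant regions of the web.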
Article
Full-text available
The antitrust landscape has changed dramatically in the last decade. Within the last two years alone, the United States Department of Justice has held hearings on the appropriate scope of Section 2, issued a comprehensive Report, and then repudiated it; and the European Commission has risen as an aggressive leader in single firm conduct enforcement by bringing abuse of dominance actions and assessing heavy fines against firms including Qualcomm, Intel, and Microsoft. In the United States, two of the most significant characteristics of the “new” antitrust approach have been a more intense focus on innovative companies in high-tech industries and a weakening of longstanding concerns that erroneous antitrust interventions will hinder economic growth. But this focus is dangerous, and these concerns should not be dismissed so lightly. In this article we offer a comprehensive cautionary tale in the context of a detailed factual, legal and economic analysis of the next Microsoft: the theoretical, but perhaps imminent, enforcement action against Google. Close scrutiny of the complex economics of Google’s technology, market and business practices reveals a range of real but subtle, pro-competitive explanations for features that have been held out instead as anticompetitive. Application of the relevant case law then reveals a set of concerns where economic complexity and ambiguity, coupled with an insufficiently-deferential approach to innovative technology and pricing practices in the most relevant precedent (the D.C. Circuit’s decision in Microsoft), portend a potentially erroneous - and costly - result. 
Our analysis, by contrast, embraces the cautious and evidence-based approach to uncertainty, complexity and dynamic innovation contained within the well-established “error cost framework.” As we demonstrate, while there is an abundance of error-cost concern in the Supreme Court precedent, there is a real risk that the current, aggressive approach to antitrust error, coupled with the uncertain economics of Google’s innovative conduct, will nevertheless yield a costly intervention. The point is not that we know that Google’s conduct is procompetitive, but rather that the very uncertainty surrounding it counsels caution, not aggression.
Article
The market for Internet search is not only economically and socially important, it is also highly concentrated. Is this a problem? We study the question whether "competition is only a free click away". We argue that the market for Internet search is characterized by indirect network externalities and construct a simple model of search engine competition, which produces a market share development that fits the empirically observed development since 2003 well. We find that there is a strong tendency towards market tipping and, subsequently, monopolization, with negative consequences on economic welfare. Therefore, we propose to require search engines to share their data on previous searches. We compare the resulting "competitive oligopoly" market structure with the less competitive current situation and show that our proposal would spur innovation, search quality, consumer surplus, and total welfare. We also discuss the practical feasibility of our policy proposal and sketch the legal issues involved.
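The tipping dynamic the abstract describes can be illustrated with a toy share-update map (an assumption for exposition, not the paper's actual model): each engine's perceived quality grows superlinearly with its current query share, and the next-period share follows relative quality. With the exponent `gamma > 1`, any lead above 50% snowballs toward monopoly.

```python
def next_share(share_a, gamma=2.0):
    """One period of an illustrative tipping map: engine A's next query
    share equals its quality weight relative to engine B's, where quality
    is current share raised to gamma (scale begets quality)."""
    wa = share_a ** gamma          # perceived quality of engine A
    wb = (1.0 - share_a) ** gamma  # perceived quality of engine B
    return wa / (wa + wb)

def simulate(share_a, gamma=2.0, steps=25):
    """Iterate the map; returns engine A's long-run share."""
    for _ in range(steps):
        share_a = next_share(share_a, gamma)
    return share_a
```

Starting from a 55% share, the map converges to nearly 100%, while an exact 50/50 split is an (unstable) fixed point. This is the feedback loop that the authors' data-sharing remedy is meant to weaken.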
Article
This lecture provides an introduction to the economics of Internet search engines. After a brief review of the historical development of the technology and the industry, I describe some of the economic features of the auction system used for displaying ads. It turns out that some relatively simple economic models provide significant insight into the operation of these auctions. In particular, the classical theory of two-sided matching markets turns out to be very useful in this context.
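The auction system referred to here is the click-weighted generalized second-price (GSP) position auction used for sponsored links: advertisers are ranked by bid times estimated click-through rate, and each pays the minimum per-click price needed to keep its slot. A minimal sketch (advertiser names and the zero reserve price are illustrative):

```python
def gsp_allocate(bids, ctrs):
    """Click-weighted GSP: rank advertisers by bid * CTR and charge
    advertiser i a per-click price of bid_{i+1} * ctr_{i+1} / ctr_i,
    the lowest bid that would preserve its ranking."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1] * ctrs[kv[0]], reverse=True)
    prices = {}
    for i, (adv, _bid) in enumerate(ranked):
        if i + 1 < len(ranked):
            nxt_adv, nxt_bid = ranked[i + 1]
            prices[adv] = nxt_bid * ctrs[nxt_adv] / ctrs[adv]
        else:
            prices[adv] = 0.0  # last slot pays the reserve (zero here)
    return [adv for adv, _ in ranked], prices
```

For example, a bidder with a low bid but a high CTR can outrank a higher bid, and its price depends on the displaced rival's click-weighted bid rather than its own, which is the source of the equilibrium analysis the lecture develops.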
Article
In many negotiations, rules are soft in the sense that the seller and/or buyers may break them at some cost. When buyers have private values, we show that the cost of such opportunistic behavior (whether by the buyers or the seller) is borne entirely by the seller in equilibrium, in the form of lower revenues. Consequently, the seller is willing to pay an auctioneer to credibly commit to a mechanism in which no one has the ability or the incentive to break the rules. Examples of “costly rule bending” considered here include hiring shill bidders and trying to learn others' bids before making one's own.
Article
When identical firms pay a fee to list prices at a price comparison site and can price discriminate between consumers who do and don't use the site, prices listed at the site are dispersed but lower than at firms' own websites.
Article
Internet-based technologies are revolutionizing the stodgy $625 billion global advertising industry. There are a number of public policy issues to consider. Will a single ad platform emerge or will several remain viable? What are the consequences of alternative market structures for a web economy that is increasingly based on selling eyeballs to advertisers? This article describes the online advertising industry. The industry is populated by a number of multi-sided platforms that facilitate connecting advertisers to viewers. Search-based advertising platforms, the most developed of these, have interesting economic features that result from the combination of keyword bidding by advertisers and single-homing.