J. Doyne Farmer

University of Oxford, Oxford, England, United Kingdom

Publications (83) · 172.93 Total Impact

  • ABSTRACT: We make an extensive empirical study of the market impact of large orders (metaorders) executed in the U.S. equity market between 2007 and 2009. We show that the square root market impact formula, which is widely used in the industry and supported by previously published research, provides a good fit only across about two orders of magnitude in order size. A logarithmic functional form fits the data better, providing a good fit across almost five orders of magnitude. We introduce the concept of an "impact surface" to model the impact as a function of both the duration and the participation rate of the metaorder, finding again a logarithmic dependence. We show that during the execution the price trajectory deviates from the market impact, a clear indication of non-VWAP executions. Surprisingly, we find that sometimes the price starts reverting well before the end of the execution. Finally, we show that, although on average the impact relaxes to approximately 2/3 of the peak impact, the precise asymptotic value of the price depends on the participation rate and on the duration of the metaorder. We present evidence that this might be due to a herding phenomenon among metaorders.
    12/2014;
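    A minimal Python sketch of the functional-form comparison described in this abstract: fit the square-root law against a logarithmic form and compare residuals. The arrays q (order size as a participation of daily volume) and y (measured peak impact) are hypothetical placeholders and the parameter names are invented; the paper's actual estimation on metaorder data is more elaborate.

        import numpy as np
        from scipy.optimize import curve_fit

        def sqrt_law(q, Y):
            return Y * np.sqrt(q)              # classic square-root impact formula

        def log_law(q, a, b):
            return a * np.log(1.0 + b * q)     # logarithmic alternative

        def compare_fits(q, y):
            """Fit both laws and return their residual sums of squares."""
            p_sqrt, _ = curve_fit(sqrt_law, q, y, p0=[0.1])
            p_log, _ = curve_fit(log_law, q, y, p0=[0.1, 100.0])
            rss = lambda f, p: float(np.sum((y - f(q, *p)) ** 2))
            return rss(sqrt_law, p_sqrt), rss(log_law, p_log)

    Over a narrow range of order sizes the two fits are hard to tell apart; the abstract's finding is that only the logarithmic form holds up across nearly five orders of magnitude.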
  • ABSTRACT: Order flow in equity markets is remarkably persistent, in the sense that order signs (to buy or sell) are positively autocorrelated out to time lags of tens of thousands of orders, corresponding to many days. Two possible explanations are herding, corresponding to positive correlation in the behavior of different investors, or order splitting, corresponding to positive autocorrelation in the behavior of single investors. We investigate this using order flow data from the London Stock Exchange, for which we have membership identifiers. By formulating models for herding and order splitting, as well as models for brokerage choice, we are able to overcome the distortion introduced by brokerage. On timescales of less than a few hours the persistence of order flow is overwhelmingly due to splitting rather than herding. We also study the properties of brokerage order flow and show that it is remarkably consistent both cross-sectionally and longitudinally.
    Journal of Economic Dynamics and Control 11/2014; · 0.86 Impact Factor
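    The decomposition at the core of this paper can be sketched compactly: at each lag, split the order-sign autocorrelation by whether the two orders came from the same member or from different members; the two pieces sum to the total ACF. A minimal Python sketch under assumed inputs (signs of ±1 and integer member identifiers as NumPy arrays); the paper's models for brokerage choice go well beyond this.

        import numpy as np

        def split_vs_herd(signs, members, max_lag):
            """Same-member (splitting) and different-member (herding)
            components of the sign autocorrelation function."""
            s = signs - signs.mean()
            var, n = s.var(), len(s)
            split, herd = [], []
            for lag in range(1, max_lag + 1):
                prod = s[:-lag] * s[lag:]
                same = members[:-lag] == members[lag:]
                split.append(prod[same].sum() / ((n - lag) * var))
                herd.append(prod[~same].sum() / ((n - lag) * var))
            return np.array(split), np.array(herd)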
  • ABSTRACT: Since the beginning of the 2008 financial crisis, almost half a trillion euros have been spent to financially assist EU member states in taxpayer-funded bail-outs. These crisis resolutions are often accompanied by austerity programs causing political and social friction on both domestic and international levels. The question of how, and under which economic preconditions, to resolve failing financial institutions is therefore a pressing and controversial issue of vast political importance. In this work we employ an agent-based model to study the economic and financial ramifications of three highly relevant crisis resolution mechanisms: the distressed institution can be closed via a purchase & assumption transaction, it can be bailed out using taxpayer money, or it can be bailed in via a debt-to-equity conversion. To establish the validity of the model we show that it reproduces a series of key stylized facts of the financial and real economy. We find that for an economy characterized by low unemployment and high productivity the optimal crisis resolution with respect to financial stability and economic productivity is to close the distressed institution. For economies in recession with high unemployment the bail-in tool provides the most efficient crisis resolution mechanism. Under no circumstances do taxpayer-funded bail-out schemes outperform bail-ins with private sector involvement.
    Journal of Economic Dynamics and Control 03/2014; · 0.86 Impact Factor
  • ABSTRACT: We study the problem of interacting channels of contagion in financial networks. The first channel of contagion is counterparty failure risk; this is captured empirically using data for the Austrian interbank network. The second channel of contagion is overlapping portfolio exposures; this is studied using a stylized model. We perform stress tests according to different protocols. For the parameters we study, neither channel of contagion results in large effects on its own. In contrast, when both channels are active at once, bankruptcies are much more common and have large systemic effects.
    Journal of Economic Dynamics and Control 01/2014; · 0.86 Impact Factor
  • ABSTRACT: For environmental problems such as global warming, future costs must be balanced against present costs. This is traditionally done using an exponential function with a constant discount rate, which reduces the present value of future costs. The result is highly sensitive to the choice of discount rate and has generated a major controversy as to the urgency of immediate action. We study analytically several standard interest rate models from finance and compare their properties to empirical data. From historical time series for nominal interest rates and inflation covering 14 countries over hundreds of years, we find that extended periods of negative real interest rates are common, occurring in many epochs in all countries. This leads us to choose the Ornstein-Uhlenbeck model, in which real short-run interest rates fluctuate stochastically and can become negative, even if they revert to a positive mean value. We solve the model in closed form and prove that the long-run discount rate is always less than the mean; indeed it can be zero or even negative, despite the fact that the mean short-term interest rate is positive. We fit the parameters of the model to the data and find that nine of the countries have positive long-run discount rates while five have negative long-run discount rates. Even if one excludes the countries where hyperinflation has occurred, our results support the low discount rate used in the Stern Review over the higher rates advocated by others.
    SSRN Electronic Journal 11/2013;
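    The closed-form solution mentioned in the abstract is the standard Ornstein-Uhlenbeck (Vasicek) bond-price formula, under which the effective discount rate declines toward the asymptote b - sigma^2/(2a^2), always below the mean rate b. A minimal numeric illustration in Python; the parameter values are invented, chosen so that the asymptote is slightly negative even though the mean short rate is positive:

        import numpy as np

        # Ornstein-Uhlenbeck real rate: dr = a*(b - r)*dt + sigma*dW
        a, b, sigma, r0 = 0.1, 0.03, 0.025, 0.03   # hypothetical parameters

        def effective_rate(t):
            """Annualized discount rate -ln E[exp(-int_0^t r ds)] / t."""
            B = (1.0 - np.exp(-a * t)) / a
            lnA = (b - sigma**2 / (2 * a**2)) * (B - t) - sigma**2 * B**2 / (4 * a)
            return -(lnA - B * r0) / t

        for t in [1, 10, 100, 1000]:
            print(f"horizon {t:4d} y: effective rate {effective_rate(t):+.4f}")
        print("asymptote:", b - sigma**2 / (2 * a**2))   # -0.00125 here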
  • ABSTRACT: In spite of the growing theoretical literature on cascades of failures in interbank lending networks, empirical results seem to suggest that networks of direct exposures are not the major channel of financial contagion. In this paper we show that networks of interbank exposures can, however, significantly amplify contagion due to overlapping portfolios. To illustrate this point, we consider the case of the Austrian interbank network and perform stress tests on it according to different protocols. We consider in particular contagion due to (i) counterparty loss; (ii) roll-over risk; and (iii) overlapping portfolios. We find that the average number of bankruptcies caused by counterparty loss and roll-over risk is fairly small if these contagion mechanisms are considered in isolation. Once portfolio overlaps are also accounted for, however, we observe that the network of direct interbank exposures significantly contributes to systemic risk.
    06/2013;
  • J. Doyne Farmer, Spyros Skouras
    Quantitative Finance 03/2013; 13(3). · 0.82 Impact Factor
  • ABSTRACT: We use a simple agent-based model of value investors in financial markets to test three credit regulation policies. The first is the unregulated case, which only imposes limits on maximum leverage. The second is Basle II, which also imposes interest rate spreads on loans and haircuts on collateral, and the third is a hypothetical alternative in which banks perfectly hedge all of their leverage-induced risk with options that are paid for by the funds. When compared to the unregulated case, both Basle II and the perfect hedge policy reduce the risk of default when leverage is low but increase it when leverage is high. This is because both regulation policies increase the amount of synchronized buying and selling needed to achieve deleveraging, which can destabilize the market. None of these policies is optimal for everyone: risk-neutral investors prefer the unregulated case with a maximum leverage of roughly four, banks prefer the perfect hedge policy, and fund managers prefer the unregulated case with a high maximum leverage. No one prefers Basle II.
    Journal of Banking & Finance 01/2013; · 1.29 Impact Factor
  • ABSTRACT: Forecasting technological progress is of great interest to engineers, policy makers, and private investors. Several models have been proposed for predicting technological improvement, but how well do these models perform? An early hypothesis, made by Theodore Wright in 1936, is that cost decreases as a power law of cumulative production. An alternative hypothesis is Moore's law, which can be generalized to say that technologies improve exponentially with time. Other alternatives were proposed by Goddard, Sinclair et al., and Nordhaus. These hypotheses have not previously been rigorously tested. Using a new database on the cost and production of 62 different technologies, the most expansive of its kind, we test the ability of six different postulated laws to predict future costs. Our approach involves hindcasting and developing a statistical model to rank the performance of the postulated laws. Wright's law produces the best forecasts, but Moore's law is not far behind. We discover a previously unobserved regularity: production tends to increase exponentially. A combination of an exponential decrease in cost and an exponential increase in production would make Moore's law and Wright's law indistinguishable, as originally pointed out by Sahal. We show for the first time that these regularities are observed in data to such a degree that the performance of these two laws is nearly the same. Our results show that technological progress is forecastable, with the square root of the logarithmic error growing linearly with the forecasting horizon at a typical rate of 2.5% per year. These results have implications for theories of technological change and for assessments of candidate technologies and policies for climate change mitigation.
    PLoS ONE 01/2013; 8(2):e52669. · 3.53 Impact Factor
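    The hindcasting procedure reduces to a pair of regressions. A minimal Python sketch for a single technology, assuming hypothetical arrays years, cost, and annual production; the paper's statistical model for ranking six laws across 62 technologies is not reproduced here.

        import numpy as np

        def hindcast(years, cost, production, split_year):
            """Fit Wright's law (cost ~ cumulative production^-w) and Moore's
            law (cost ~ exp(-m t)) on data up to split_year, then return
            out-of-sample log-cost forecast errors for each law."""
            log_c = np.log(cost)
            log_q = np.log(np.cumsum(production))
            train = years <= split_year
            w_slope, w_icpt = np.polyfit(log_q[train], log_c[train], 1)
            m_slope, m_icpt = np.polyfit(years[train], log_c[train], 1)
            wright_err = log_c[~train] - (w_icpt + w_slope * log_q[~train])
            moore_err = log_c[~train] - (m_icpt + m_slope * years[~train])
            return wright_err, moore_err

    Because production itself tends to grow exponentially in time, the two regressors are nearly collinear, which is the Sahal observation behind the near-identical performance of the two laws.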
  • ABSTRACT: We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. It implies new challenges and opportunities for policy and for managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
    The European Physical Journal Special Topics 11/2012; 214(1). · 1.80 Impact Factor
  • ABSTRACT: Common asset holdings are widely believed to have been the primary vector of contagion in the recent financial crisis. We develop a network approach to the amplification of financial contagion due to the combination of overlapping portfolios and leverage, and we show how it can be understood in terms of a generalized branching process. By studying a stylized model we estimate the circumstances under which systemic instabilities are likely to occur as a function of parameters such as leverage, market crowding, diversification, and market impact. Although diversification may be good for individual institutions, it can create dangerous systemic effects, and as a result financial contagion gets worse with too much diversification. Under our model there is a critical threshold for leverage; below it financial networks are always stable, and above it the unstable region grows as leverage increases. The financial system exhibits "robust yet fragile" behavior, with regions of the parameter space where contagion is rare but catastrophic whenever it occurs. Our model and methods of analysis can be calibrated to real data and provide simple yet powerful tools for macroprudential stress testing.
    Journal of Banking & Finance 10/2012; · 1.29 Impact Factor
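    A minimal simulation of the stylized overlapping-portfolio mechanism, in Python with invented parameters; the paper's branching-process analysis and calibration are not attempted here.

        import numpy as np

        rng = np.random.default_rng(1)
        n_banks, n_assets, n_holdings = 100, 50, 5   # hypothetical network size
        leverage, alpha = 25.0, 0.3    # assets/equity; linear impact per unit sold

        holdings = np.zeros((n_banks, n_assets))
        for i in range(n_banks):
            cols = rng.choice(n_assets, n_holdings, replace=False)
            holdings[i, cols] = 1.0 / n_holdings     # unit assets, equally weighted

        equity = np.full(n_banks, 1.0 / leverage)
        failed = np.zeros(n_banks, dtype=bool)
        failed[rng.integers(n_banks)] = True         # seed the cascade with one failure

        while True:                                  # iterate to the contagion fixed point
            sold = holdings[failed].sum(axis=0)      # total liquidation per asset
            price_drop = np.minimum(alpha * sold, 1.0)
            losses = holdings @ price_drop           # mark-to-market losses per bank
            newly = (losses > equity) & ~failed
            if not newly.any():
                break
            failed |= newly
        print(f"{failed.sum()} of {n_banks} banks fail")

    Sweeping leverage upward in such a toy model reproduces the qualitative picture of the abstract: below a critical value the seed failure stays isolated, above it global cascades appear.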
  • ABSTRACT: The practice of valuation by marking to market with current trading prices is seriously flawed. Under leverage the problem is particularly dramatic: because of the concave form of market impact, selling always initially causes the expected leverage to increase. There is a critical leverage above which it is impossible to exit a portfolio without leverage going to infinity and bankruptcy becoming likely. Standard risk-management methods give no warning of this problem, which easily occurs for aggressively leveraged positions in illiquid markets. We propose an alternative accounting procedure based on the estimated market impact of liquidation that removes the illusion of profit. This should curb the leverage cycle and contribute to enhanced stability of financial markets.
    04/2012;
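    The core mechanism, that under concave impact selling always initially raises leverage, and that above a critical leverage a full exit is impossible, can be illustrated with a few lines of Python. All numbers are invented (square-root impact, a fund starting at leverage 10):

        import numpy as np

        A, D = 100.0, 90.0      # assets and debt: initial leverage A/(A-D) = 10
        k = 0.15                # hypothetical square-root impact coefficient

        for s in [0.0, 0.01, 0.05, 0.2, 0.5, 1.0]:
            price = 1.0 - k * np.sqrt(s)    # concave impact of selling a fraction s
            proceeds = s * A * price        # cash raised goes to repay debt
            A_new = (1.0 - s) * A * price   # the remainder is marked at the new price
            equity = A_new - (D - proceeds)
            if equity <= 0:
                print(f"sold {s:4.0%}: equity {equity:+.2f} (bankrupt)")
            else:
                print(f"sold {s:4.0%}: leverage {A_new / equity:6.2f}")

    Leverage climbs from 10 to roughly 23 as sales proceed, and equity turns negative before the position can be closed: at these numbers the fund is already past the critical leverage, precisely the situation that mark-to-market valuation fails to flag.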
  • Fabrizio Lillo, J. Doyne Farmer
    ABSTRACT: Recent empirical analyses have shown that liquidity fluctuations are important for understanding large price changes of financial assets. These liquidity fluctuations are quantified by gaps in the order book, corresponding to blocks of adjacent price levels containing no quotes. Here we study the statistical properties of the state of the limit order book for 16 stocks traded at the London Stock Exchange (LSE). We show that the time series of the first three gaps are characterized by fat tails in the probability distribution and are described by long memory processes.
    Fluctuation and Noise Letters 01/2012; 05(02). · 0.89 Impact Factor
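    The long memory reported for the gap series can be checked with a crude rescaled-range (R/S) estimate of the Hurst exponent, with H > 0.5 indicating long memory. A minimal Python sketch, assuming a hypothetical 1-d array of gap observations with no constant windows:

        import numpy as np

        def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
            """Rescaled-range estimate of the Hurst exponent H."""
            rs = []
            for w in windows:
                chunks = x[: len(x) // w * w].reshape(-1, w)
                dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
                r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
                s = chunks.std(axis=1)                  # per-window standard deviation
                rs.append(np.mean(r / s))
            slope, _ = np.polyfit(np.log(windows), np.log(rs), 1)
            return slope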
  • ABSTRACT: Using a database of sixty-two different technologies, we study the issue of forecasting technological progress. We do so using the following methodology: pretending to be at a given time in the past, we forecast technology prices for years up to the present day. Since our forecasts are made in the past, we refer to this as hindcasting, and we analyze the predictions relative to what happened historically. We use hindcasting to evaluate a variety of different hypotheses for technological improvement. Our results indicate that forecasts using production are better than those using time. This conclusion is robust when analyzing randomly chosen subsets of our technology database. We then turn to investigating the interdependence of revenue and technological progress. We derive analytically an upper bound on the rate of technology improvement given the condition of increasing revenue and show empirically that all technologies fall within our derived bound. Our results suggest that the observed advantage of using production models for forecasting is due in part to the direct relationship between production and revenue.
    01/2012;
  • Yonathan Schwarzkopf, J. Doyne Farmer
    ABSTRACT: Mutual funds potentially face diminishing returns to scale due to the fact that trading cost per dollar increases with trading size. Nonetheless, empirical analysis shows that the returns of mutual funds do not depend on fund size. Two theories have been offered to explain this: according to Berk and Green, large funds have more pre-trading-cost skill and more trading cost, in equal measure; according to Fama and French, both skill and trading costs are close to zero, and therefore independent of size. We offer an alternative theory based on the assumption that fund managers are profit maximizers and that investor selection forces the expected returns of funds to be independent of size. Using a reduced model we show empirically that trading costs per dollar traded do indeed increase with size. Large mutual funds compensate for this by charging lower fees and trading more stocks less frequently. Surprisingly, pre-cost skill is nearly independent of size, and is in fact slightly smaller for large funds.
    09/2011;
  • ABSTRACT: We consider a model of contagion in financial networks recently introduced in the literature, and we characterize the effect of a few features empirically observed in real networks on the stability of the system. Notably, we consider the effect of heterogeneous degree distributions, heterogeneous balance sheet sizes, and degree correlations between banks. We study the probability of contagion conditional on the failure of a random bank, of the most connected bank, and of the biggest bank, and we consider the effect of targeted policies aimed at increasing the capital requirements of a few banks with high connectivity or big balance sheets. Networks with heterogeneous degree distributions are shown to be more resilient to contagion triggered by the failure of a random bank, but more fragile with respect to contagion triggered by the failure of highly connected nodes. A power law distribution of balance sheet size is shown to induce an inefficient diversification that makes the system more prone to contagion events. A targeted policy aimed at reinforcing the stability of the biggest banks is shown to improve the stability of the system in the regime of high average degree. Finally, disassortative mixing, such as that observed in real banking networks, is shown to enhance the stability of the system.
    09/2011;
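    A minimal counterparty-loss cascade of the kind this model family studies (a Gai-Kapadia-style dynamic on a homogeneous random network; all parameters invented, and the paper's heterogeneous-degree and targeted-policy experiments are not reproduced):

        import numpy as np

        rng = np.random.default_rng(2)
        n, avg_degree = 200, 5.0
        adj = rng.random((n, n)) < avg_degree / n   # adj[i, j]: bank i lends to bank j
        np.fill_diagonal(adj, False)
        n_borrowers = np.maximum(adj.sum(axis=1), 1)

        ib_frac, capital = 0.2, 0.03   # interbank assets and capital buffer, as fractions of assets
        failed = np.zeros(n, dtype=bool)
        failed[rng.integers(n)] = True  # initial shock: one random failure

        while True:
            # bank i's loss: the share of its interbank assets lent to failed banks
            loss = ib_frac * (adj & failed).sum(axis=1) / n_borrowers
            newly = (loss > capital) & ~failed
            if not newly.any():
                break
            failed |= newly
        print(f"{failed.sum()} of {n} banks fail")

    Replacing the homogeneous network with heavy-tailed degree or balance sheet distributions, and targeting the initial failure at the most connected bank, reproduces the comparisons made in the abstract.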
  • Bence Toth, Imon Palit, Fabrizio Lillo, J. Doyne Farmer
    ABSTRACT: Equity order flow is persistent in the sense that buy orders tend to be followed by buy orders and sell orders tend to be followed by sell orders. This persistence is extremely long-ranged, with positive correlations spanning thousands of orders, over time intervals of up to several days. Such persistence in supply and demand is economically important because it influences market impact as a function of both time and size, and because it indicates that the market is in a sense out of equilibrium. Persistence can be caused by two types of behavior: (1) order splitting, in which a single investor repeatedly places orders of the same sign, or (2) herding, in which different investors place orders of the same sign. We develop a method to decompose the autocorrelation function into splitting and herding components and apply it to order flow data from the London Stock Exchange containing exchange membership identifiers. Members typically act as brokers for other investors, so it is not clear whether the patterns we observe in brokerage data also reflect patterns in the behavior of single investors. To address this problem we develop models for the distortion caused by brokerage and demonstrate that persistence in order flow is overwhelmingly due to order splitting by single investors. At longer time scales we observe that different investors' behavior is anti-correlated. We show that this is due to differences in the response to price-changing vs. non-price-changing market orders.
    08/2011;
  • ABSTRACT: We study a simple model for the evolution of the cost (or more generally the performance) of a technology or production process. The technology can be decomposed into n components, each of which interacts with a cluster of d - 1 other components. Innovation occurs through a series of trial-and-error events, each of which consists of randomly changing the cost of each component in a cluster and accepting the changes only if the total cost of the cluster is lowered. We show that the relationship between the cost of the whole technology and the number of innovation attempts is asymptotically a power law, matching the functional form often observed for empirical data. The exponent α of the power law depends on the intrinsic difficulty of finding better components and on what we term the design complexity: the more complex the design, the slower the rate of improvement. Letting d, as defined above, be the connectivity, in the special case in which the connectivity is constant the design complexity is simply the connectivity. When the connectivity varies, bottlenecks can arise in which a few components limit progress, and the design complexity then depends on the details of the design. The number of bottlenecks also determines whether progress is steady or whether there are periods of stasis punctuated by occasional large changes. Our model connects the engineering properties of a design to historical studies of technology improvement.
    Proceedings of the National Academy of Sciences 05/2011; 108(22):9008-13. · 9.81 Impact Factor
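    The model is simple enough to simulate directly. A minimal Python sketch with invented sizes (constant connectivity d; the paper's treatment of variable connectivity and bottlenecks is not attempted):

        import numpy as np

        rng = np.random.default_rng(0)
        n, d = 40, 3                 # n components, clusters of size d (hypothetical)
        cost = rng.random(n)
        # cluster i = component i together with d-1 randomly chosen others
        clusters = [np.append(rng.choice(np.delete(np.arange(n), i), d - 1,
                                         replace=False), i) for i in range(n)]

        total = []
        for attempt in range(10**5):
            c = clusters[rng.integers(n)]     # pick a random cluster
            trial = rng.random(d)             # propose new random component costs
            if trial.sum() < cost[c].sum():   # accept only cost-lowering changes
                cost[c] = trial
            total.append(cost.sum())
        # on log-log axes, total cost vs. attempts falls off roughly as a power
        # law, with a slower rate of improvement for larger d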
  • ABSTRACT: We present an empirical study of the intertwined behaviour of members in a financial market. Exploiting a database in which the broker that initiates an order book event can be identified, we decompose the correlation and response functions into contributions coming from different market participants and study how their behaviour is interconnected. We find evidence that (1) brokers are very heterogeneous in liquidity provision: some are consistently liquidity providers while others are consistently liquidity takers; (2) the behaviour of brokers is strongly conditioned on the actions of other brokers, whereas brokers are only weakly influenced by the impact of their own previous orders; (3) the total impact of market orders is the result of a subtle compensation between the same broker pushing the price in one direction and the liquidity provision of other brokers pushing it in the opposite direction. These results reinforce the picture of market dynamics as the result of competition between heterogeneous participants interacting to form a complicated market ecology.
    Quantitative Finance 04/2011; · 0.82 Impact Factor
  • ABSTRACT: We develop a theory for the market impact of large trading orders, which we call metaorders because they are typically split into small pieces and executed incrementally. Market impact is empirically observed to be a concave function of metaorder size, i.e., the impact per share of large metaorders is smaller than that of small metaorders. We formulate a stylized model of an algorithmic execution service and derive a fair pricing condition, which says that the average transaction price of the metaorder is equal to the price after trading is completed. We show that at equilibrium the distribution of trading volume adjusts to reflect information, and dictates the shape of the impact function. The resulting theory makes empirically testable predictions for the functional form of both the temporary and permanent components of market impact. Based on the commonly observed asymptotic distribution for the volume of large trades, it says that market impact should increase asymptotically roughly as the square root of metaorder size, with average permanent impact relaxing to about two thirds of peak impact.
    Quantitative Finance 02/2011; · 0.82 Impact Factor
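    The two-thirds figure has a quick sanity check: if impact during execution grows as the square root of the volume executed so far, the volume-weighted average transaction price sits at the integral of sqrt(x) from 0 to 1, i.e. 2/3 of the peak; fair pricing (average price = post-trade price) then pins permanent impact at about two thirds of peak impact. A one-line numeric confirmation in Python:

        import numpy as np

        # mean of sqrt(x) over x in [0, 1] approaches 2/3 of peak impact
        print(np.sqrt(np.linspace(0.0, 1.0, 1_000_001)).mean())   # ~0.6667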

Publication Stats

2k Citations
172.93 Total Impact Points

Institutions

  • 2013–2014
    • University of Oxford
      Oxford, England, United Kingdom
  • 1999–2013
    • Santa Fe Institute
      Santa Fe, New Mexico, United States
  • 2005–2012
    • Libera Università Internazionale degli Studi Sociali Guido Carli
      Roma, Latium, Italy
  • 2010
    • California Institute of Technology
      Pasadena, California, United States
  • 2003–2010
    • Università degli Studi di Palermo
      • Dipartimento di Fisica e Chimica
      Palermo, Sicily, Italy
  • 2009
    • ISI Foundation
      Torino, Piedmont, Italy
  • 2001
    • Yale University
      • Department of Economics
      New Haven, Connecticut, United States