J. Doyne Farmer

University of Oxford, Oxford, England, United Kingdom

Publications (132) · 404.76 Total Impact

  • J. Doyne Farmer · Cameron Hepburn · Penny Mealy · Alexander Teytelboym ·
    ABSTRACT: Modelling the economics of climate change is daunting. Many existing methodologies from social and physical sciences need to be deployed, and new modelling techniques and ideas still need to be developed. Existing bread-and-butter micro- and macroeconomic tools, such as the expected utility framework, market equilibrium concepts and representative agent assumptions, are far from adequate. Four key issues—along with several others—remain inadequately addressed by economic models of climate change, namely: (1) uncertainty, (2) aggregation, heterogeneity and distributional implications, (3) technological change, and most of all, (4) realistic damage functions for the economic impact of the physical consequences of climate change. This paper assesses the main shortcomings of two generations of climate-energy-economic models and proposes that a new wave of models needs to be developed to tackle these four challenges. This paper then examines two potential candidate approaches—dynamic stochastic general equilibrium (DSGE) models and agent-based models (ABM). The successful use of agent-based models in other areas, such as modelling the financial system, housing markets and technological progress, suggests their potential applicability to better modelling the economics of climate change.
    Environmental and Resource Economics 10/2015; 62(2). DOI:10.1007/s10640-015-9965-2 · 1.52 Impact Factor
  • Source
    Christoph Aymanns · Fabio Caccioli · J. Doyne Farmer · Vincent W. C. Tan ·
    ABSTRACT: Effective risk control must make a tradeoff between the microprudential risk of exogenous shocks to individual institutions and the macroprudential risks caused by their systemic interactions. We investigate a simple dynamical model for understanding this tradeoff, consisting of a bank with a leverage target and an unleveraged fundamental investor subject to exogenous noise with clustered volatility. The parameter space has three regions: (i) a stable region, where the system always reaches a fixed point equilibrium; (ii) a locally unstable region, characterized by cycles and chaotic behavior; and (iii) a globally unstable region. A crude calibration of parameters to data puts the model in region (ii). In this region there is a slowly building price bubble, resembling a "Great Moderation", followed by a crash, with a period of approximately 10-15 years, which we dub the "Basel leverage cycle". We propose a criterion for rating macroprudential policies based on their ability to minimize risk for a given average leverage. We construct a one-parameter family of leverage policies that allows us to vary from the procyclical policies of Basel II or III, in which leverage decreases when volatility increases, to countercyclical policies in which leverage increases when volatility increases. We find that the best policy depends critically on three parameters: the average leverage used by the bank, the relative size of the bank and the fundamentalist, and the amplitude of the exogenous noise. Basel II is optimal when the exogenous noise is high, the bank is small and leverage is low; in the opposite limit, where the bank is large or leverage is high, the optimal policy is closer to constant leverage. We also find that systemic risk can be dramatically decreased by lowering the leverage target adjustment speed of the banks.
  •
    ABSTRACT: In November, 2011, the Financial Stability Board, in collaboration with the International Monetary Fund, published a list of 29 "systemically important financial institutions" (SIFIs). This designation reflects a concern that the failure of any one of them could have dramatic negative consequences for the global economy and is based on "their size, complexity, and systemic interconnectedness". While the characteristics of "size" and "systemic interconnectedness" have been the subject of a good deal of quantitative analysis, less attention has been paid to measures of a firm's "complexity." In this paper we take on the challenges of measuring the complexity of a financial institution and to that end explore the use of the structure of an individual firm's control hierarchy as a proxy for institutional complexity. The control hierarchy is a network representation of the institution and its subsidiaries. We show that this mathematical representation (and various associated metrics) provides a consistent way to compare the complexity of firms with often very disparate business models and as such may provide the foundation for determining a SIFI designation. By quantifying the level of complexity of a firm, our approach also may prove useful should firms need to reduce their level of complexity either in response to business or regulatory needs. Using a data set containing the control hierarchies of many of the designated SIFIs, we find that in the past two years, these firms have decreased their level of complexity, perhaps in response to regulatory requirements.
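    The data set of SIFI control hierarchies used in the paper is not public; the sketch below is only an illustration of the general idea, using a hypothetical hierarchy and a few simple structural statistics (entity count, depth, average branching) as stand-ins for the paper's complexity metrics.

        # Hypothetical control hierarchy (parent entity -> controlled subsidiaries),
        # with a few simple structural statistics as stand-ins for complexity metrics.
        # Names and numbers are invented, not taken from the paper's data set.

        hierarchy = {
            "HoldingCo":    ["BankEU", "BankUS", "InsuranceArm"],
            "BankEU":       ["BrokerDE", "LeasingFR"],
            "BankUS":       ["BrokerNY"],
            "InsuranceArm": [],
            "BrokerDE":     [],
            "LeasingFR":    [],
            "BrokerNY":     [],
        }

        def depth(node, tree):
            """Length of the longest chain of control below `node`."""
            children = tree.get(node, [])
            return 0 if not children else 1 + max(depth(c, tree) for c in children)

        def complexity_stats(root, tree):
            n_entities = len(tree)
            n_leaves = sum(1 for children in tree.values() if not children)
            internal = n_entities - n_leaves
            return {
                "entities": n_entities,                        # total number of legal entities
                "leaves": n_leaves,                            # entities with no subsidiaries
                "depth": depth(root, tree),                    # longest chain of control
                "avg_branching": (n_entities - 1) / internal,  # subsidiaries per controlling entity
            }

        print(complexity_stats("HoldingCo", hierarchy))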
  • Source
    ABSTRACT: We analyze how to value future costs and benefits when they must be discounted relative to the present. We introduce the subject for the nonspecialist and take into account the randomness of the economic evolution by studying the discount function of three widely used processes for the dynamics of interest rates: Ornstein-Uhlenbeck, Feller, and log-normal. Besides obtaining exact expressions for the discount function and simple asymptotic approximations, we show that historical average interest rates overestimate long-run discount rates and that this effect can be large. In other words, long-run discount rates should be substantially less than the average rate observed in the past, otherwise any cost-benefit calculation would be biased in favor of the present and against interventions that may protect the future.
    Physical Review E 05/2015; 91(5). DOI:10.1103/PhysRevE.91.052816 · 2.29 Impact Factor
  • Source
    Elia Zarinelli · Michele Treccani · J. Doyne Farmer · Fabrizio Lillo ·
    ABSTRACT: We make an extensive empirical study of the market impact of large orders (metaorders) executed in the U.S. equity market between 2007 and 2009. We show that the square root market impact formula, which is widely used in the industry and supported by previous published research, provides a good fit only across about two orders of magnitude in order size. A logarithmic functional form fits the data better, providing a good fit across almost five orders of magnitude. We introduce the concept of an "impact surface" to model the impact as a function of both the duration and the participation rate of the metaorder, finding again a logarithmic dependence. We show that during the execution the price trajectory deviates from the market impact, a clear indication of non-VWAP executions. Surprisingly, we find that sometimes the price starts reverting well before the end of the execution. Finally we show that, although on average the impact relaxes to approximately 2/3 of the peak impact, the precise asymptotic value of the price depends on the participation rate and on the duration of the metaorder. We present evidence that this might be due to a herding phenomenon among metaorders.
    12/2014; DOI:10.1142/S2382626615500045
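    The metaorder data used in this study are proprietary; the sketch below uses entirely synthetic numbers and only illustrates the kind of comparison the abstract describes: fitting impact against order size with a square-root versus a logarithmic functional form.

        # Synthetic illustration: compare square-root vs logarithmic fits of
        # market impact as a function of metaorder size. All parameters and
        # noise below are invented; no real trade data are used.

        import numpy as np

        rng = np.random.default_rng(0)

        # order sizes spanning several orders of magnitude (fraction of daily volume)
        q = np.logspace(-5, -1, 400)
        # hypothetical "true" impact, closer to logarithmic than to square-root
        impact = 1e-3 * np.log1p(q / 1e-4) * np.exp(0.1 * rng.standard_normal(q.size))

        def r_squared(design):
            """Least-squares fit impact ~ a*design(q) + b and return R^2."""
            X = np.column_stack([design(q), np.ones_like(q)])
            coef, *_ = np.linalg.lstsq(X, impact, rcond=None)
            resid = impact - X @ coef
            return 1 - resid.var() / impact.var()

        print("square-root fit R^2:", round(r_squared(np.sqrt), 3))
        print("logarithmic fit R^2:", round(r_squared(lambda x: np.log1p(x / 1e-4)), 3))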
  • Source
    Bence Tóth · Imon Palit · Fabrizio Lillo · J. Doyne Farmer ·
    ABSTRACT: Order flow in equity markets is remarkably persistent in the sense that order signs (to buy or sell) are positively autocorrelated out to time lags of tens of thousands of orders, corresponding to many days. Two possible explanations are herding, corresponding to positive correlation in the behavior of different investors, or order splitting, corresponding to positive autocorrelation in the behavior of single investors. We investigate this using order flow data from the London Stock Exchange for which we have membership identifiers. By formulating models for herding and order splitting, as well as models for brokerage choice, we are able to overcome the distortion introduced by brokerage. On timescales of less than a few hours the persistence of order flow is overwhelmingly due to splitting rather than herding. We also study the properties of brokerage order flow and show that it is remarkably consistent both cross-sectionally and longitudinally.
    Journal of Economic Dynamics and Control 11/2014; 51. DOI:10.1016/j.jedc.2014.10.007 · 0.86 Impact Factor
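    No LSE data are reproduced here; the sketch below is a toy illustration of the splitting explanation: generating order flow as runs of same-signed child orders from heavy-tailed metaorders produces positive sign autocorrelation over long lags, even though the simulated investors act independently.

        # Toy illustration (no real LSE data): order flow built purely from order
        # splitting. Each simulated investor executes one metaorder as a run of
        # same-signed child orders; run lengths are heavy-tailed. The aggregate
        # sign series is then positively autocorrelated over long lags.

        import numpy as np

        rng = np.random.default_rng(11)

        signs = []
        while len(signs) < 100_000:
            run_length = int(rng.pareto(1.5) * 10) + 1   # heavy-tailed metaorder size
            signs.extend([rng.choice([-1, 1])] * run_length)
        signs = np.array(signs[:100_000], dtype=float)

        def sign_autocorrelation(x, lag):
            x = x - x.mean()
            return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

        for lag in (1, 10, 100, 1000):
            print(f"lag {lag:4d}: autocorrelation {sign_autocorrelation(signs, lag):.3f}")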
  •
    ABSTRACT: The current model of economic growth generated unprecedented increases in human wealth and prosperity during the 19th and 20th centuries. The main mechanisms have been the rapid pace of technological and social innovation, human capital accumulation, and the conversion of resources and natural capital into more valuable forms of produced capital. However, there is evidence emerging that this model may be approaching environmental limits and planetary boundaries, and that the conversion of natural capital needs to slow down rapidly and then be reversed. Some commentators have asserted that in order for this to occur, we will need to stop growing altogether and, instead, seek prosperity without growth. Others argue that environmental concerns are low-priority luxuries to be contemplated once global growth has properly returned to levels observed prior to the 2008 financial crisis. A third group argues that there is no trade-off, and, instead, promotes green growth: the (politically appealing) idea is that we can simultaneously grow and address our environmental problems. This paper provides a critical perspective on this debate and suggests that a substantial research agenda is required to come to grips with these challenges. One place to start is with the relevant metrics: measures of per-capita wealth, and, eventually, quantitative measures of prosperity, alongside a dashboard of other sustainability indicators. A public and political focus on wealth (a stock), and its annual changes, could realistically complement the current focus on market-based gross output as measured by GDP (a flow). This could have important policy implications, but deeper changes to governance and business models will be required.
    China & World Economy 09/2014; 22(5). DOI:10.1111/j.1749-124X.2014.12085.x · 0.52 Impact Factor
  • Source
    Christoph Aymanns · J. Doyne Farmer ·
    ABSTRACT: We present a simple agent-based model of a financial system composed of leveraged investors such as banks that invest in stocks and manage their risk using a Value-at-Risk constraint, based on historical observations of asset prices. The Value-at-Risk constraint implies that when perceived risk is low, leverage is high and vice versa, a phenomenon that has been dubbed pro-cyclical leverage. We show that this leads to endogenous irregular oscillations, in which gradual increases in stock prices and leverage are followed by drastic market collapses, i.e. a leverage cycle. This phenomenon is studied using simplified models that give a deeper understanding of the dynamics and the nature of the feedback loops and instabilities underlying the leverage cycle. We introduce a flexible leverage regulation policy in which it is possible to continuously tune from pro-cyclical to countercyclical leverage. When the policy is sufficiently countercyclical and bank risk is sufficiently low the endogenous oscillation disappears and prices go to a fixed point. While there is always a leverage ceiling above which the dynamics are unstable, countercyclical leverage can be used to raise the ceiling. Finally, we investigate fixed limits on leverage and show how they can control the leverage cycle.
    SSRN Electronic Journal 07/2014; 50. DOI:10.2139/ssrn.2468800
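    The model itself is specified in the paper; the following is only a loose caricature, with invented parameters and a leverage cap of my own choosing, of the feedback it describes: perceived risk is estimated from recent returns, the bank targets leverage inversely proportional to perceived volatility (a Value-at-Risk-style rule), and the bank's rebalancing feeds back into the price.

        # Loose caricature (not the paper's model) of a pro-cyclical leverage rule:
        # perceived risk is an exponential moving average of squared returns, the
        # bank targets leverage ~ 1/sqrt(perceived risk) up to a cap, and its
        # rebalancing demand feeds back into the (log) price. Parameters are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        delta, alpha, b, cap = 0.02, 0.01, 0.5, 20.0   # risk memory, bank impact, risk target, leverage cap
        x = 0.0                  # log price
        sigma2 = 0.01            # perceived variance of returns
        leverage = b / np.sqrt(sigma2)

        log_prices, leverages = [], []
        for t in range(5000):
            target = min(b / np.sqrt(sigma2), cap)          # VaR-style leverage target
            bank_demand = alpha * (target - leverage)       # buys when raising leverage
            leverage += 0.1 * (target - leverage)           # gradual adjustment toward target
            r = -0.1 * x + 0.02 * rng.standard_normal() + bank_demand
            x += r                                          # fundamentalist pull + noise + bank demand
            sigma2 = (1 - delta) * sigma2 + delta * r**2    # update perceived risk
            log_prices.append(x)
            leverages.append(leverage)

        print("log-price range:", round(min(log_prices), 2), "to", round(max(log_prices), 2))
        print("leverage range: ", round(min(leverages), 1), "to", round(max(leverages), 1))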
  • Source
    Peter Klimek · Sebastian Poledna · J. Doyne Farmer · Stefan Thurner ·
    ABSTRACT: Since the beginning of the 2008 financial crisis almost half a trillion euros have been spent to financially assist EU member states in taxpayer-funded bail-outs. These crisis resolutions are often accompanied by austerity programs causing political and social friction on both domestic and international levels. The question of how, and under which economic preconditions, to resolve failing financial institutions is therefore a pressing and controversial issue of vast political importance. In this work we employ an agent-based model to study the economic and financial ramifications of three highly relevant crisis resolution mechanisms. To establish the validity of the model we show that it reproduces a series of key stylized facts of the financial and real economy. The distressed institution can either be closed via a purchase & assumption transaction, it can be bailed out using taxpayer money, or it may be bailed in via a debt-to-equity conversion. We find that for an economy characterized by low unemployment and high productivity, the optimal crisis resolution with respect to financial stability and economic productivity is to close the distressed institution. For economies in recession with high unemployment, the bail-in tool provides the most efficient crisis resolution mechanism. Under no circumstances do taxpayer-funded bail-out schemes outperform bail-ins with private sector involvement.
    Journal of Economic Dynamics and Control 03/2014; 50. DOI:10.1016/j.jedc.2014.08.020 · 0.86 Impact Factor
  • Source
    ABSTRACT: If the historical average annual real interest rate is m > 0, and if the world is stationary, should consumption in the distant future be discounted at the rate of m per year? Suppose the annual real interest rate r(t) reverts to m according to the Ornstein-Uhlenbeck (OU) continuous-time process dr(t) = alpha [m - r(t)] dt + k dw(t), where w is a standard Wiener process. Then we prove that the long run rate of interest is r_infinity = m - k^2/(2 alpha^2). This confirms the Weitzman-Gollier principle that the volatility and the persistence of interest rates lower long run discounting. We fit the OU model to historical data across 14 countries covering 87 to 318 years and estimate the average short rate m and the long run rate r_infinity for each country. The data corroborate that, when doing cost-benefit analysis, the long run rate of discount should be taken to be substantially less than the average short run rate observed over a very long history.
    SSRN Electronic Journal 01/2014; DOI:10.2139/ssrn.2465953
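    The closed-form result quoted above, r_infinity = m - k^2/(2 alpha^2), can be checked numerically. The sketch below uses illustrative parameter values (not the paper's country estimates): it simulates the Ornstein-Uhlenbeck short rate, averages the discount factor exp(-integral of r dt) over paths, and compares the implied long-run rate with the formula.

        # Numerical check of the long-run discount rate for an Ornstein-Uhlenbeck short rate
        #   dr = alpha*(m - r)*dt + k*dW,   long-run rate r_inf = m - k^2 / (2*alpha^2).
        # Parameter values are illustrative, not the paper's country estimates.

        import numpy as np

        rng = np.random.default_rng(42)

        alpha, m, k = 0.2, 0.04, 0.02        # mean reversion, mean rate, volatility (made up)
        T, dt, n_paths = 200.0, 0.05, 10_000
        n_steps = int(T / dt)

        r = np.full(n_paths, m)              # start each path at the mean rate
        integral = np.zeros(n_paths)         # running integral of r(t) dt along each path
        for _ in range(n_steps):
            integral += r * dt
            r += alpha * (m - r) * dt + k * np.sqrt(dt) * rng.standard_normal(n_paths)

        discount = np.exp(-integral).mean()  # average discount factor over paths
        r_sim = -np.log(discount) / T        # implied long-run discount rate
        r_formula = m - k**2 / (2 * alpha**2)

        print(f"simulated long-run rate : {r_sim:.4f}")
        print(f"formula m - k^2/(2 a^2) : {r_formula:.4f}")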
  • Fabio Caccioli · J. Doyne Farmer · Nick Foti · Daniel Rockmore ·
    ABSTRACT: We study the problem of interacting channels of contagion in financial networks. The first channel of contagion is counterparty failure risk; this is captured empirically using data for the Austrian interbank network. The second channel of contagion is overlapping portfolio exposures; this is studied using a stylized model. We perform stress tests according to different protocols. For the parameters we study neither channel of contagion results in large effects on its own. In contrast, when both channels are active at once, bankruptcies are much more common and have large systemic effects.
    Journal of Economic Dynamics and Control 01/2014; 51. DOI:10.1016/j.jedc.2014.09.041 · 0.86 Impact Factor
  • J. Doyne Farmer ·
    ABSTRACT: Although it is often said that economics is too much like physics, to a physicist economics is not at all like physics. The difference is in the scientific methods of the two fields: theoretical economics uses a top-down approach in which hypotheses and mathematical rigor come first and empirical confirmation comes second. Physics, in contrast, embraces the bottom-up ‘experimental philosophy’ of Newton, in which ‘hypotheses are inferred from phenomena, and afterward rendered general by induction’. Progress would accelerate if economics were to truly make empirical verification the ultimate arbiter of theories, which would force it to open up to alternative approaches.
    Journal of Economic Methodology 12/2013; 20(4). DOI:10.1080/1350178X.2013.859408
  • Source
    ABSTRACT: For environmental problems such as global warming future costs must be balanced against present costs. This is traditionally done using an exponential function with a constant discount rate, which reduces the present value of future costs. The result is highly sensitive to the choice of discount rate and has generated a major controversy as to the urgency for immediate action. We study analytically several standard interest rate models from finance and compare their properties to empirical data. From historical time series for nominal interest rates and inflation covering 14 countries over hundreds of years, we find that extended periods of negative real interest rates are common, occurring in many epochs in all countries. This leads us to choose the Ornstein-Uhlenbeck model, in which real short run interest rates fluctuate stochastically and can become negative, even if they revert to a positive mean value. We solve the model in closed form and prove that the long-run discount rate is always less than the mean; indeed it can be zero or even negative, despite the fact that the mean short term interest rate is positive. We fit the parameters of the model to the data, and find that nine of the countries have positive long run discount rates while five have negative long-run discount rates. Even if one rejects the countries where hyperinflation has occurred, our results support the low discounting rate used in the Stern report over higher rates advocated by others.
    SSRN Electronic Journal 11/2013; DOI:10.2139/ssrn.2355737
  • Source
    Fabio Caccioli · J. Doyne Farmer · Nick Foti · Daniel Rockmore ·
    ABSTRACT: In spite of the growing theoretical literature on cascades of failures in interbank lending networks, empirical results seem to suggest that networks of direct exposures are not the major channel of financial contagion. In this paper we show that networks of interbank exposures can however significantly amplify contagion due to overlapping portfolios. To illustrate this point, we consider the case of the Austrian interbank network and perform stress tests on it according to different protocols. We consider in particular contagion due to (i) counterparty loss; (ii) roll-over risk; and (iii) overlapping portfolios. We find that the average number of bankruptcies caused by counterparty loss and roll-over risk is fairly small if these contagion mechanisms are considered in isolation. Once portfolio overlaps are also accounted for, however, we observe that the network of direct interbank exposures significantly contributes to systemic risk.
  • J. Doyne Farmer · Spyros Skouras ·

    Quantitative Finance 03/2013; 13(3). DOI:10.1080/14697688.2012.757636 · 0.65 Impact Factor
  • Source
    Béla Nagy · J Doyne Farmer · Quan M Bui · Jessika E Trancik ·
    ABSTRACT: Forecasting technological progress is of great interest to engineers, policy makers, and private investors. Several models have been proposed for predicting technological improvement, but how well do these models perform? An early hypothesis made by Theodore Wright in 1936 is that cost decreases as a power law of cumulative production. An alternative hypothesis is Moore's law, which can be generalized to say that technologies improve exponentially with time. Other alternatives were proposed by Goddard, Sinclair et al., and Nordhaus. These hypotheses have not previously been rigorously tested. Using a new database on the cost and production of 62 different technologies, which is the most expansive of its kind, we test the ability of six different postulated laws to predict future costs. Our approach involves hindcasting and developing a statistical model to rank the performance of the postulated laws. Wright's law produces the best forecasts, but Moore's law is not far behind. We discover a previously unobserved regularity that production tends to increase exponentially. A combination of an exponential decrease in cost and an exponential increase in production would make Moore's law and Wright's law indistinguishable, as originally pointed out by Sahal. We show for the first time that these regularities are observed in data to such a degree that the performance of these two laws is nearly the same. Our results show that technological progress is forecastable, with the square root of the logarithmic error growing linearly with the forecasting horizon at a typical rate of 2.5% per year. These results have implications for theories of technological change, and assessments of candidate technologies and policies for climate change mitigation.
    PLoS ONE 02/2013; 8(2):e52669. DOI:10.1371/journal.pone.0052669 · 3.23 Impact Factor
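    The 62-technology database is not reproduced here; purely as an illustration of the hindcasting comparison described above, the sketch below generates a synthetic cost series and compares out-of-sample errors of a Wright's-law fit (log cost versus log cumulative production) and a Moore's-law fit (log cost versus time). Because the synthetic production grows exponentially, the two fits come out nearly indistinguishable, echoing Sahal's observation cited in the abstract.

        # Illustration only (synthetic data, not the paper's 62-technology database):
        # hindcast a cost series with Wright's law (log cost ~ log cumulative production)
        # and Moore's law (log cost ~ time), and compare out-of-sample log errors.

        import numpy as np

        rng = np.random.default_rng(7)

        years = np.arange(1970, 2010)
        t = (years - years[0]).astype(float)
        production = np.exp(0.08 * t)                 # production growing exponentially
        cum_production = np.cumsum(production)
        # "true" costs follow Wright's law (exponent -0.4) plus noise
        log_cost = np.log(100.0) - 0.4 * np.log(cum_production) + 0.1 * rng.standard_normal(t.size)

        split = 30                                    # fit the first 30 years, test on the rest

        def rms_hindcast_error(x):
            """Fit log_cost ~ a*x + b in-sample and return the RMS log error out-of-sample."""
            a, b = np.polyfit(x[:split], log_cost[:split], 1)
            pred = a * x[split:] + b
            return float(np.sqrt(np.mean((pred - log_cost[split:]) ** 2)))

        print("Wright's law RMS log error:", round(rms_hindcast_error(np.log(cum_production)), 3))
        print("Moore's law  RMS log error:", round(rms_hindcast_error(t), 3))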
  • Source
    Sebastian Poledna · Stefan Thurner · J. Doyne Farmer · John Geanakoplos ·
    ABSTRACT: We use a simple agent based model of value investors in financial markets to test three credit regulation policies. The first is the unregulated case, which only imposes limits on maximum leverage. The second is Basle II, which also imposes interest rate spreads on loans and haircuts on collateral, and the third is a hypothetical alternative in which banks perfectly hedge all of their leverage-induced risk with options that are paid for by the funds. When compared to the unregulated case both Basle II and the perfect hedge policy reduce the risk of default when leverage is low but increase it when leverage is high. This is because both regulation policies increase the amount of synchronized buying and selling needed to achieve deleveraging, which can destabilize the market. None of these policies are optimal for everyone: Risk neutral investors prefer the unregulated case with a maximum leverage of roughly four, banks prefer the perfect hedge policy, and fund managers prefer the unregulated case with a high maximum leverage. No one prefers Basle II.
    Journal of Banking & Finance 01/2013; 42(1). DOI:10.1016/j.jbankfin.2014.01.038 · 1.29 Impact Factor
  • Source
    Tobias Galla · J Doyne Farmer ·
    ABSTRACT: Game theory is the standard tool used to model strategic interactions in evolutionary biology and social science. Traditionally, game theory studies the equilibria of simple games. However, is this useful if the game is complicated, and if not, what is? We define a complicated game as one with many possible moves, and therefore many possible payoffs conditional on those moves. We investigate two-person games in which the players learn based on a type of reinforcement learning called experience-weighted attraction (EWA). By generating games at random, we characterize the learning dynamics under EWA and show that there are three clearly separated regimes: (i) convergence to a unique fixed point, (ii) a huge multiplicity of stable fixed points, and (iii) chaotic behavior. In case (iii), the dimension of the chaotic attractors can be very high, implying that the learning dynamics are effectively random. In the chaotic regime, the total payoffs fluctuate intermittently, showing bursts of rapid change punctuated by periods of quiescence, with heavy tails similar to what is observed in fluid turbulence and financial markets. Our results suggest that, at least for some learning algorithms, there is a large parameter regime for which complicated strategic interactions generate inherently unpredictable behavior that is best described in the language of dynamical systems theory.
    Proceedings of the National Academy of Sciences 01/2013; 110(4). DOI:10.1073/pnas.1109672110 · 9.67 Impact Factor
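    The paper characterizes the learning dynamics analytically; the code below is a minimal sketch, not the paper's exact specification, of an EWA-style update: two players with random payoff matrices decay their attractions, reinforce every move by the payoff it would have earned against the opponent's last move, and choose moves with logit probabilities.

        # Minimal sketch (not the paper's exact specification) of EWA-style learning
        # in a random two-player game with many moves. Payoffs and parameters are
        # randomly generated for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        n_moves = 20                      # a "complicated" game: many possible moves
        payoff = [rng.standard_normal((n_moves, n_moves)) for _ in range(2)]  # payoff[i][own, other]

        alpha, beta = 0.1, 5.0            # memory loss and intensity of choice (logit rule)
        attractions = [np.zeros(n_moves), np.zeros(n_moves)]

        def choose(a):
            """Pick a move with probability proportional to exp(beta * attraction)."""
            p = np.exp(beta * (a - a.max()))
            p /= p.sum()
            return rng.choice(len(a), p=p)

        for t in range(5000):
            moves = [choose(attractions[0]), choose(attractions[1])]
            for i in range(2):
                other = moves[1 - i]
                # decay old attractions and reinforce every move (forgone payoffs included)
                # by the payoff it would have earned against the opponent's last move
                attractions[i] = (1 - alpha) * attractions[i] + alpha * payoff[i][:, other]

        realized = [payoff[i][moves[i], moves[1 - i]] for i in range(2)]
        print("final-round payoffs:", [round(float(x), 3) for x in realized])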
  • Source
    J. Doyne Farmer · M. Gallegati · C. Hommes · A. Kirman · P. Ormerod · S. Cincotti · A. Sanchez · D. Helbing ·
    ABSTRACT: We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
    The European Physical Journal Special Topics 11/2012; 214(1). DOI:10.1140/epjst/e2012-01696-9 · 1.40 Impact Factor
  • Source
    Fabio Caccioli · Munik K Shrestha · Cristopher Moore · J. Doyne Farmer ·
    ABSTRACT: Common asset holdings are widely believed to have been the primary vector of contagion in the recent financial crisis. We develop a network approach to the amplification of financial contagion due to the combination of overlapping portfolios and leverage, and we show how it can be understood in terms of a generalized branching process. By studying a stylized model we estimate the circumstances under which systemic instabilities are likely to occur as a function of parameters such as leverage, market crowding, diversification, and market impact. Although diversification may be good for individual institutions, it can create dangerous systemic effects, and as a result financial contagion gets worse with too much diversification. Under our model there is a critical threshold for leverage; below it financial networks are always stable, and above it the unstable region grows as leverage increases. The financial system exhibits "robust yet fragile" behavior, with regions of the parameter space where contagion is rare but catastrophic whenever it occurs. Our model and methods of analysis can be calibrated to real data and provide simple yet powerful tools for macroprudential stress testing.
    Journal of Banking & Finance 10/2012; 46. DOI:10.2139/ssrn.2176080 · 1.29 Impact Factor
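    The paper analyzes contagion as a generalized branching process and calibrates its model to data; the sketch below is a much-simplified, hypothetical fire-sale cascade on a random bipartite bank-asset network, intended only to illustrate the overlapping-portfolio channel: a price shock wipes out some leveraged banks, their liquidations move prices through a linear impact rule, and mark-to-market losses can push further banks into default.

        # Much-simplified, hypothetical fire-sale cascade on a random bipartite
        # bank-asset network, illustrating the overlapping-portfolio contagion
        # channel described above (this is not the paper's calibrated model).

        import numpy as np

        rng = np.random.default_rng(5)

        n_banks, n_assets = 100, 20
        diversification = 5                  # assets held per bank
        leverage = 20.0                      # assets / equity, so equity per bank = 1/leverage
        impact = 0.3                         # price drop per unit fraction of an asset liquidated

        # random bipartite holdings: each bank spreads one unit of assets over a few assets
        holdings = np.zeros((n_banks, n_assets))
        for bank in range(n_banks):
            assets = rng.choice(n_assets, size=diversification, replace=False)
            holdings[bank, assets] = 1.0 / diversification

        equity = np.full(n_banks, 1.0 / leverage)
        prices = np.ones(n_assets)
        total_held = holdings.sum(axis=0)
        failed = np.zeros(n_banks, dtype=bool)

        prices[0] *= 0.7                     # exogenous shock: asset 0 loses 30% of its value
        while True:
            losses = holdings @ (1.0 - prices)             # mark-to-market loss per bank
            newly_failed = ~failed & (losses > equity)
            if not newly_failed.any():
                break
            failed |= newly_failed
            # newly failed banks liquidate; prices fall with the fraction of each asset sold
            sold = holdings[newly_failed].sum(axis=0)
            prices *= 1.0 - impact * sold / total_held

        print("banks failed after shock and cascade:", int(failed.sum()), "of", n_banks)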

Publication Stats

9k Citations
404.76 Total Impact Points


  • 2011-2015
    • University of Oxford
      • Mathematical Institute
      Oxford, England, United Kingdom
  • 1990-2014
    • Santa Fe Institute
      Santa Fe, New Mexico, United States
  • 1983-2006
    • Los Alamos National Laboratory
      • Theoretical Division
      • Center for Nonlinear Studies
      Los Alamos, New Mexico, United States
  • 1989
    • University of Michigan
      • Division of Computer Science and Engineering
      Ann Arbor, Michigan, United States
  • 1986
    • University of Illinois, Urbana-Champaign
      Urbana, Illinois, United States