Publications (137) · 428.56 Total Impact
ABSTRACT: Modelling the economics of climate change is daunting. Many existing methodologies from the social and physical sciences need to be deployed, and new modelling techniques and ideas still need to be developed. Existing bread-and-butter micro- and macroeconomic tools, such as the expected utility framework, market equilibrium concepts and representative agent assumptions, are far from adequate. Four key issues—along with several others—remain inadequately addressed by economic models of climate change, namely: (1) uncertainty, (2) aggregation, heterogeneity and distributional implications, (3) technological change, and, most of all, (4) realistic damage functions for the economic impact of the physical consequences of climate change. This paper assesses the main shortcomings of two generations of climate-energy-economic models and proposes that a new wave of models needs to be developed to tackle these four challenges. The paper then examines two potential candidate approaches—dynamic stochastic general equilibrium (DSGE) models and agent-based models (ABMs). The successful use of agent-based models in other areas, such as in modelling the financial system, housing markets and technological progress, suggests their potential applicability to better modelling the economics of climate change.
Article: Taming the Basel Leverage Cycle
ABSTRACT: Effective risk control must make a tradeoff between the microprudential risk of exogenous shocks to individual institutions and the macroprudential risks caused by their systemic interactions. We investigate a simple dynamical model for understanding this tradeoff, consisting of a bank with a leverage target and an unleveraged fundamental investor subject to exogenous noise with clustered volatility. The parameter space has three regions: (i) a stable region, where the system always reaches a fixed-point equilibrium; (ii) a locally unstable region, characterized by cycles and chaotic behavior; and (iii) a globally unstable region. A crude calibration of parameters to data puts the model in region (ii). In this region there is a slowly building price bubble, resembling a "Great Moderation", followed by a crash, with a period of approximately 10-15 years, which we dub the "Basel leverage cycle". We propose a criterion for rating macroprudential policies based on their ability to minimize risk for a given average leverage. We construct a one-parameter family of leverage policies that allows us to vary from the procyclical policies of Basel II or III, in which leverage decreases when volatility increases, to countercyclical policies in which leverage increases when volatility increases. We find the best policy depends critically on three parameters: the average leverage used by the bank; the relative size of the bank and the fundamentalist; and the amplitude of the exogenous noise. Basel II is optimal when the exogenous noise is high, the bank is small and leverage is low; in the opposite limit, where the bank is large or leverage is high, the optimal policy is closer to constant leverage. We also find that systemic risk can be dramatically decreased by lowering the leverage target adjustment speed of the banks.
ABSTRACT: In November 2011, the Financial Stability Board, in collaboration with the International Monetary Fund, published a list of 29 "systemically important financial institutions" (SIFIs). This designation reflects a concern that the failure of any one of them could have dramatic negative consequences for the global economy and is based on "their size, complexity, and systemic interconnectedness". While the characteristics of "size" and "systemic interconnectedness" have been the subject of a good deal of quantitative analysis, less attention has been paid to measures of a firm's "complexity". In this paper we take on the challenge of measuring the complexity of a financial institution, and to that end explore the use of the structure of an individual firm's control hierarchy as a proxy for institutional complexity. The control hierarchy is a network representation of the institution and its subsidiaries. We show that this mathematical representation (and various associated metrics) provides a consistent way to compare the complexity of firms with often very disparate business models, and as such may provide the foundation for determining a SIFI designation. By quantifying the level of complexity of a firm, our approach may also prove useful should firms need to reduce their level of complexity, either in response to business or regulatory needs. Using a data set containing the control hierarchies of many of the designated SIFIs, we find that in the past two years these firms have decreased their level of complexity, perhaps in response to regulatory requirements.
ABSTRACT: We analyze how to value future costs and benefits when they must be discounted relative to the present. We introduce the subject for the nonspecialist and take into account the randomness of the economic evolution by studying the discount function of three widely used processes for the dynamics of interest rates: Ornstein-Uhlenbeck, Feller, and log-normal. Besides obtaining exact expressions for the discount function and simple asymptotic approximations, we show that historical average interest rates overestimate long-run discount rates and that this effect can be large. In other words, long-run discount rates should be substantially less than the average rate observed in the past; otherwise any cost-benefit calculation would be biased in favor of the present and against interventions that may protect the future.
ABSTRACT: Recently it has become clear that many technologies follow a generalized version of Moore's law, i.e. costs tend to drop exponentially, at different rates that depend on the technology. Here we formulate Moore's law as a time series model and apply it to historical data on 53 technologies. Under the simple assumption of a correlated geometric random walk we derive a closed-form expression approximating the distribution of forecast errors as a function of time. Based on hindcasting experiments we show that it is possible to collapse the forecast errors for many different technologies at many time horizons onto the same universal distribution. As a practical demonstration we make distributional forecasts at different time horizons for solar photovoltaic modules, and show how our method can be used to estimate the probability that a given technology will outperform another at a given point in the future.
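The forecasting setup described here can be illustrated with a minimal sketch of the basic geometric random walk (ignoring the noise autocorrelation for brevity): log-costs drift down at an estimated historical rate, and the point forecast extrapolates that drift. The drift and noise parameters below are illustrative, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_costs(n, mu=-0.10, sigma=0.15):
    """Geometric random walk: log-costs drift down at rate mu with noise sigma."""
    return np.exp(np.cumsum(mu + sigma * rng.standard_normal(n)))

def forecast(costs, horizon):
    """Point forecast: extrapolate the average historical log decline."""
    logs = np.log(costs)
    mu_hat = np.mean(np.diff(logs))
    return np.exp(logs[-1] + mu_hat * horizon)

series = simulate_costs(40)
print(forecast(series, horizon=10))  # forecast cost 10 periods ahead
```

In the hindcasting spirit of the paper, one would fit `mu_hat` on a truncated series and compare the forecast against the held-out tail, with forecast-error variance growing with the horizon.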
ABSTRACT: We make an extensive empirical study of the market impact of large orders (metaorders) executed in the U.S. equity market between 2007 and 2009. We show that the square-root market impact formula, which is widely used in the industry and supported by previously published research, provides a good fit only across about two orders of magnitude in order size. A logarithmic functional form fits the data better, providing a good fit across almost five orders of magnitude. We introduce the concept of an "impact surface" to model the impact as a function of both the duration and the participation rate of the metaorder, finding again a logarithmic dependence. We show that during the execution the price trajectory deviates from the market impact, a clear indication of non-VWAP executions. Surprisingly, we find that sometimes the price starts reverting well before the end of the execution. Finally, we show that, although on average the impact relaxes to approximately 2/3 of the peak impact, the precise asymptotic value of the price depends on the participation rate and on the duration of the metaorder. We present evidence that this might be due to a herding phenomenon among metaorders.
Article: Why is order flow so persistent?
ABSTRACT: Order flow in equity markets is remarkably persistent, in the sense that order signs (to buy or sell) are positively autocorrelated out to time lags of tens of thousands of orders, corresponding to many days. Two possible explanations are herding, corresponding to positive correlation in the behavior of different investors, or order splitting, corresponding to positive autocorrelation in the behavior of single investors. We investigate this using order flow data from the London Stock Exchange for which we have membership identifiers. By formulating models for herding and order splitting, as well as models for brokerage choice, we are able to overcome the distortion introduced by brokerage. On timescales of less than a few hours the persistence of order flow is overwhelmingly due to splitting rather than herding. We also study the properties of brokerage order flow and show that it is remarkably consistent both cross-sectionally and longitudinally.
ABSTRACT: The current model of economic growth generated unprecedented increases in human wealth and prosperity during the 19th and 20th centuries. The main mechanisms have been the rapid pace of technological and social innovation, human capital accumulation, and the conversion of resources and natural capital into more valuable forms of produced capital. However, evidence is emerging that this model may be approaching environmental limits and planetary boundaries, and that the conversion of natural capital needs to slow down rapidly and then be reversed. Some commentators have asserted that in order for this to occur, we will need to stop growing altogether and, instead, seek prosperity without growth. Others argue that environmental concerns are low-priority luxuries to be contemplated once global growth has properly returned to levels observed prior to the 2008 financial crisis. A third group argues that there is no tradeoff and instead promotes green growth: the (politically appealing) idea is that we can simultaneously grow and address our environmental problems. This paper provides a critical perspective on this debate and suggests that a substantial research agenda is required to come to grips with these challenges. One place to start is with the relevant metrics: measures of per-capita wealth and, eventually, quantitative measures of prosperity, alongside a dashboard of other sustainability indicators. A public and political focus on wealth (a stock), and its annual changes, could realistically complement the current focus on market-based gross output as measured by GDP (a flow). This could have important policy implications, but deeper changes to governance and business models will be required.
Article: The dynamics of the leverage cycle
ABSTRACT: We present a simple agent-based model of a financial system composed of leveraged investors, such as banks, that invest in stocks and manage their risk using a Value-at-Risk constraint based on historical observations of asset prices. The Value-at-Risk constraint implies that when perceived risk is low, leverage is high, and vice versa, a phenomenon that has been dubbed procyclical leverage. We show that this leads to endogenous irregular oscillations, in which gradual increases in stock prices and leverage are followed by drastic market collapses, i.e. a leverage cycle. This phenomenon is studied using simplified models that give a deeper understanding of the dynamics and the nature of the feedback loops and instabilities underlying the leverage cycle. We introduce a flexible leverage regulation policy in which it is possible to continuously tune from procyclical to countercyclical leverage. When the policy is sufficiently countercyclical and bank risk is sufficiently low, the endogenous oscillation disappears and prices go to a fixed point. While there is always a leverage ceiling above which the dynamics are unstable, countercyclical leverage can be used to raise the ceiling. Finally, we investigate fixed limits on leverage and show how they can control the leverage cycle.
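A leverage policy that tunes continuously between procyclical and countercyclical behavior can be sketched as a power law of perceived volatility. The functional form and parameter names below are illustrative assumptions for exposition, not the paper's exact specification.

```python
import numpy as np

def leverage_target(sigma, alpha=0.1, b=0.5):
    """Leverage target as a function of perceived risk sigma.

    b > 0 gives procyclical leverage (Value-at-Risk-like: lower perceived
    risk -> higher leverage); b < 0 gives countercyclical leverage.
    alpha and b are illustrative parameters, not calibrated values.
    """
    return alpha * sigma ** (-b)

vols = np.array([0.01, 0.02, 0.04])
print(leverage_target(vols, b=0.5))   # procyclical: falls as volatility rises
print(leverage_target(vols, b=-0.5))  # countercyclical: rises with volatility
```

Tuning `b` from positive to negative moves the policy from the procyclical to the countercyclical regime discussed in the abstract.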
ABSTRACT: Since the beginning of the 2008 financial crisis, almost half a trillion euros have been spent to financially assist EU member states in taxpayer-funded bailouts. These crisis resolutions are often accompanied by austerity programs causing political and social friction on both domestic and international levels. The question of how, and under which economic preconditions, to resolve failing financial institutions is therefore a pressing and controversial issue of vast political importance. In this work we employ an agent-based model to study the economic and financial ramifications of three highly relevant crisis resolution mechanisms. To establish the validity of the model, we show that it reproduces a series of key stylized facts of the financial and real economy. The distressed institution can either be closed via a purchase & assumption transaction, it can be bailed out using taxpayer money, or it may be bailed in via a debt-to-equity conversion. We find that for an economy characterized by low unemployment and high productivity, the optimal crisis resolution with respect to financial stability and economic productivity is to close the distressed institution. For economies in recession with high unemployment, the bail-in tool provides the most efficient crisis resolution mechanism. Under no circumstances do taxpayer-funded bailout schemes outperform bail-ins with private sector involvement.
Article: Discounting the Distant Future
ABSTRACT: If the historical average annual real interest rate is m > 0, and if the world is stationary, should consumption in the distant future be discounted at the rate of m per year? Suppose the annual real interest rate r(t) reverts to m according to the Ornstein-Uhlenbeck (OU) continuous-time process dr(t) = alpha[m - r(t)]dt + k dw(t), where w is a standard Wiener process. Then we prove that the long-run rate of interest is r_infinity = m - k^2/(2 alpha^2). This confirms the Weitzman-Gollier principle that the volatility and the persistence of interest rates lower long-run discounting. We fit the OU model to historical data across 14 countries covering 87 to 318 years and estimate the average short rate m and the long-run rate r_infinity for each country. The data corroborate that, when doing cost-benefit analysis, the long-run rate of discount should be taken to be substantially less than the average short-run rate observed over a very long history.
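The closed-form long-run rate r_infinity = m - k^2/(2 alpha^2) can be checked numerically: simulate OU short-rate paths, average the realized discount factors exp(-integral of r), and back out the implied long-run rate. The parameters below are illustrative, not the paper's country estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: mean rate m, reversion speed alpha, volatility k.
m, alpha, k = 0.04, 1.0, 0.2

# Closed-form long-run discount rate for the OU process (from the abstract):
r_inf = m - k**2 / (2 * alpha**2)

# Monte Carlo check: D(T) = E[exp(-integral of r)] over OU paths.
dt, T, n_paths = 0.05, 100.0, 20000
n_steps = int(T / dt)
decay = np.exp(-alpha * dt)                       # exact OU transition
noise_sd = k * np.sqrt((1 - decay**2) / (2 * alpha))

r = np.full(n_paths, m)                           # start at the mean rate
integral = np.zeros(n_paths)
for _ in range(n_steps):
    r_new = m + (r - m) * decay + noise_sd * rng.standard_normal(n_paths)
    integral += 0.5 * (r + r_new) * dt            # trapezoidal integration
    r = r_new

r_longrun_mc = -np.log(np.mean(np.exp(-integral))) / T
print(r_inf, r_longrun_mc)  # the long-run rate sits below the mean rate m
```

Because the discount factor is convex in the rate, paths with low realized rates dominate the average, which is why the long-run rate falls below m.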
ABSTRACT: We study the problem of interacting channels of contagion in financial networks. The first channel of contagion is counterparty failure risk; this is captured empirically using data for the Austrian interbank network. The second channel of contagion is overlapping portfolio exposures; this is studied using a stylized model. We perform stress tests according to different protocols. For the parameters we study, neither channel of contagion results in large effects on its own. In contrast, when both channels are active at once, bankruptcies are much more common and have large systemic effects.
ABSTRACT: Although it is often said that economics is too much like physics, to a physicist economics is not at all like physics. The difference lies in the scientific methods of the two fields: theoretical economics uses a top-down approach in which hypothesis and mathematical rigor come first and empirical confirmation comes second. Physics, in contrast, embraces the bottom-up ‘experimental philosophy’ of Newton, in which ‘hypotheses are inferred from phenomena, and afterward rendered general by induction’. Progress would accelerate if economics were to truly make empirical verification the ultimate arbiter of theories, which would force it to open up to alternative approaches.
ABSTRACT: For environmental problems such as global warming, future costs must be balanced against present costs. This is traditionally done using an exponential function with a constant discount rate, which reduces the present value of future costs. The result is highly sensitive to the choice of discount rate and has generated a major controversy as to the urgency of immediate action. We study analytically several standard interest rate models from finance and compare their properties to empirical data. From historical time series for nominal interest rates and inflation covering 14 countries over hundreds of years, we find that extended periods of negative real interest rates are common, occurring in many epochs in all countries. This leads us to choose the Ornstein-Uhlenbeck model, in which real short-run interest rates fluctuate stochastically and can become negative, even if they revert to a positive mean value. We solve the model in closed form and prove that the long-run discount rate is always less than the mean; indeed it can be zero or even negative, despite the fact that the mean short-term interest rate is positive. We fit the parameters of the model to the data and find that nine of the countries have positive long-run discount rates while five have negative long-run discount rates. Even if one rejects the countries where hyperinflation has occurred, our results support the low discount rate used in the Stern report over the higher rates advocated by others.
ABSTRACT: In spite of the growing theoretical literature on cascades of failures in interbank lending networks, empirical results seem to suggest that networks of direct exposures are not the major channel of financial contagion. In this paper we show that networks of interbank exposures can however significantly amplify contagion due to overlapping portfolios. To illustrate this point, we consider the case of the Austrian interbank network and perform stress tests on it according to different protocols. We consider in particular contagion due to (i) counterparty loss; (ii) rollover risk; and (iii) overlapping portfolios. We find that the average number of bankruptcies caused by counterparty loss and rollover risk is fairly small if these contagion mechanisms are considered in isolation. Once portfolio overlaps are also accounted for, however, we observe that the network of direct interbank exposures significantly contributes to systemic risk. 
ABSTRACT: Forecasting technological progress is of great interest to engineers, policy makers, and private investors. Several models have been proposed for predicting technological improvement, but how well do these models perform? An early hypothesis, made by Theodore Wright in 1936, is that cost decreases as a power law of cumulative production. An alternative hypothesis is Moore's law, which can be generalized to say that technologies improve exponentially with time. Other alternatives were proposed by Goddard, Sinclair et al., and Nordhaus. These hypotheses have not previously been rigorously tested. Using a new database on the cost and production of 62 different technologies, which is the most expansive of its kind, we test the ability of six different postulated laws to predict future costs. Our approach involves hindcasting and developing a statistical model to rank the performance of the postulated laws. Wright's law produces the best forecasts, but Moore's law is not far behind. We discover a previously unobserved regularity: production tends to increase exponentially. A combination of an exponential decrease in cost and an exponential increase in production would make Moore's law and Wright's law indistinguishable, as originally pointed out by Sahal. We show for the first time that these regularities are observed in data to such a degree that the performance of these two laws is nearly the same. Our results show that technological progress is forecastable, with the square root of the logarithmic error growing linearly with the forecasting horizon at a typical rate of 2.5% per year. These results have implications for theories of technological change, and for assessments of candidate technologies and policies for climate change mitigation.
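Wright's law, cost = A * Q^(-w) with Q the cumulative production, is typically fit by linear regression in log-log space. A minimal sketch on synthetic, noiseless data (the values 50 and 0.3 are arbitrary illustrative choices):

```python
import numpy as np

def fit_wrights_law(cum_production, cost):
    """Fit cost = A * Q^(-w) by least squares in log-log space."""
    logQ, logC = np.log(cum_production), np.log(cost)
    slope, logA = np.polyfit(logQ, logC, 1)
    return np.exp(logA), -slope   # scale A, learning exponent w

# Synthetic series obeying Wright's law exactly, with exponent 0.3:
Q = np.arange(1.0, 101.0)
C = 50.0 * Q ** -0.3
A, w = fit_wrights_law(Q, C)
print(A, w)   # recovers A = 50, w = 0.3 on noiseless data
```

On real data one would fit on an early window and hindcast the later costs, which is the ranking procedure the abstract describes.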
ABSTRACT: We use a simple agent-based model of value investors in financial markets to test three credit regulation policies. The first is the unregulated case, which only imposes limits on maximum leverage. The second is Basle II, which also imposes interest rate spreads on loans and haircuts on collateral, and the third is a hypothetical alternative in which banks perfectly hedge all of their leverage-induced risk with options that are paid for by the funds. When compared to the unregulated case, both Basle II and the perfect-hedge policy reduce the risk of default when leverage is low but increase it when leverage is high. This is because both regulation policies increase the amount of synchronized buying and selling needed to achieve deleveraging, which can destabilize the market. None of these policies is optimal for everyone: risk-neutral investors prefer the unregulated case with a maximum leverage of roughly four, banks prefer the perfect-hedge policy, and fund managers prefer the unregulated case with a high maximum leverage. No one prefers Basle II.
ABSTRACT: Game theory is the standard tool used to model strategic interactions in evolutionary biology and social science. Traditionally, game theory studies the equilibria of simple games. However, is this useful if the game is complicated, and if not, what is? We define a complicated game as one with many possible moves, and therefore many possible payoffs conditional on those moves. We investigate two-person games in which the players learn based on a type of reinforcement learning called experience-weighted attraction (EWA). By generating games at random, we characterize the learning dynamics under EWA and show that there are three clearly separated regimes: (i) convergence to a unique fixed point, (ii) a huge multiplicity of stable fixed points, and (iii) chaotic behavior. In case (iii), the dimension of the chaotic attractors can be very high, implying that the learning dynamics are effectively random. In the chaotic regime, the total payoffs fluctuate intermittently, showing bursts of rapid change punctuated by periods of quiescence, with heavy tails similar to what is observed in fluid turbulence and financial markets. Our results suggest that, at least for some learning algorithms, there is a large parameter regime for which complicated strategic interactions generate inherently unpredictable behavior that is best described in the language of dynamical systems theory.
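The EWA learning rule referenced here is the standard Camerer-Ho update: attractions decay, are reinforced by realized and foregone payoffs, and feed a logit choice rule. A minimal single-player sketch with illustrative parameter values (the abstract's full setting couples two such learners through a random payoff matrix):

```python
import numpy as np

def ewa_step(A, N, chosen, payoffs, phi=0.9, delta=0.5, kappa=0.0):
    """One experience-weighted attraction (EWA) update for one player.

    A: attraction vector over moves; N: experience weight; chosen: index
    of the move actually played; payoffs: payoff each move would have
    earned. phi (decay), delta (weight on foregone payoffs) and kappa
    are the standard EWA parameters; values here are illustrative.
    """
    N_new = phi * (1 - kappa) * N + 1
    reinforcement = (delta + (1 - delta) * (np.arange(len(A)) == chosen)) * payoffs
    A_new = (phi * N * A + reinforcement) / N_new
    return A_new, N_new

def choice_probs(A, lam=5.0):
    """Logit choice: softmax of attractions with intensity lam."""
    z = np.exp(lam * (A - A.max()))
    return z / z.sum()

A, N = np.zeros(3), 1.0
A, N = ewa_step(A, N, chosen=1, payoffs=np.array([0.0, 1.0, 0.2]))
print(choice_probs(A))   # move 1, which paid the most, becomes most likely
```

Iterating this map for two coupled players over a random game is what produces the fixed-point, multi-equilibrium and chaotic regimes described above.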
ABSTRACT: We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
Publication Stats
11k Citations
428.56 Total Impact Points
Institutions

2011-2015
University of Oxford
 Mathematical Institute
Oxford, England, United Kingdom

1990-2014
Santa Fe Institute
Santa Fe, New Mexico, United States

1983-2006
Los Alamos National Laboratory
 • Theoretical Division
 • Center for Nonlinear Studies
Los Alamos, New Mexico, United States

1989
University of Michigan
 Division of Computer Science and Engineering
Ann Arbor, Michigan, United States

1986
University of Illinois, Urbana-Champaign
Urbana, Illinois, United States