Preprint

Radical Complexity


Abstract

This is an informal and sketchy review of six topical, somewhat unrelated subjects in quantitative finance: rough volatility models; random covariance matrix theory; copulas; crowded trades; high-frequency trading & market stability; and "radical complexity" & scenario-based (macro)economics. Some open questions and research directions are briefly discussed.


References
Article
We discuss the impact of a Covid-19–like shock on a simple model economy, described by the previously developed Mark-0 Agent-Based Model. We consider a mixed supply and demand shock, and show that depending on the shock parameters (amplitude and duration), our model economy can display V-shaped, U-shaped or W-shaped recoveries, and even an L-shaped output curve with permanent output loss. This is due to the economy getting trapped in a self-sustained "bad" state. We then discuss two policies that attempt to moderate the impact of the shock: giving easy credit to firms, and so-called helicopter money, i.e. injecting new money into households' savings. We find that both policies are effective if strong enough. We highlight the potential danger of terminating these policies too early, although inflation is substantially increased by lax access to credit. Finally, we consider the impact of a second lockdown. While we only discuss a limited number of scenarios, our model is flexible and versatile enough to accommodate a wide variety of situations, thus serving as a useful exploratory tool for a qualitative, scenario-based understanding of post-Covid recovery. The corresponding code is available online.
Article
This paper is devoted to the important yet unexplored subject of crowding effects on market impact, which we call 'co-impact'. Our analysis is based on a large database of metaorders by institutional investors in the U.S. equity market. We find that the market chiefly reacts to the net order flow of ongoing metaorders, without distinguishing them individually. The joint co-impact of multiple contemporaneous metaorders depends on the total number of metaorders and their mutual sign correlation. Using a simple heuristic model calibrated on data, we reproduce very well the different regimes of the empirical market impact curves as a function of the volume fraction φ: square-root for large φ, linear for intermediate φ, and a finite intercept I₀ when φ → 0. The value of I₀ grows with the sign correlation coefficient. Our study sheds light on an apparent paradox: how can a non-linear impact law survive in the presence of a large number of simultaneously executed metaorders?
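As a purely illustrative sketch (not the paper's calibrated heuristic model), the three regimes can be encoded in a toy impact curve with a finite intercept and a linear-to-square-root crossover; all parameter values below are arbitrary assumptions.

```python
import numpy as np

def co_impact(phi, i0=1e-4, a=5.0, phi_c=1e-3, sigma_d=0.02):
    """Toy co-impact curve (illustrative only, arbitrary parameters).

    phi     : net participation (volume fraction) of contemporaneous metaorders
    i0      : finite intercept as phi -> 0, expected to grow with the sign correlation
    phi_c   : crossover between the linear and square-root regimes
    sigma_d : volatility scale multiplying the impact
    """
    crossover = np.where(phi < phi_c, a * phi, a * np.sqrt(phi * phi_c))
    return i0 + sigma_d * crossover

phis = np.array([1e-5, 1e-3, 1e-1])
for phi, imp in zip(phis, co_impact(phis)):
    print(f"phi = {phi:.0e}  ->  impact = {imp:.5f}")
```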
Article
We introduce and establish the main properties of QHawkes ("Quadratic" Hawkes) models. QHawkes models generalize the Hawkes price models introduced in E. Bacry et al. (2014), by allowing feedback effects in the jump intensity that are both linear and quadratic in past returns. A non-parametric fit on NYSE stock data shows that the off-diagonal component of the quadratic kernel indeed has a structure that standard Hawkes models fail to reproduce. Our model exhibits two main properties that we believe are crucial for modelling and understanding the volatility process: first, the model is time-reversal asymmetric, much like financial markets, whose time evolution has a preferred direction. Second, it generates a multiplicative, fat-tailed volatility process, which we characterize in detail in the case of exponentially decaying kernels and which is linked to Pearson diffusions in the continuous limit. Several other interesting properties of QHawkes processes are discussed, in particular the fact that they can generate long memory without necessarily being at the critical point. Finally, we provide numerical simulations of our calibrated QHawkes model, which is indeed seen to reproduce, with only a small amount of quadratic non-linearity, the correct magnitude of fat tails and time-reversal asymmetry seen in empirical time series.
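Schematically, and in our own notation (the abstract does not spell it out), the QHawkes intensity supplements the linear Hawkes kernel $L$ with a two-time quadratic kernel $K$ acting on past price changes $dP$:

$$\lambda_t = \lambda_\infty + \int_{-\infty}^{t} L(t-s)\,\mathrm{d}P_s + \int_{-\infty}^{t}\!\int_{-\infty}^{t} K(t-s,\,t-u)\,\mathrm{d}P_s\,\mathrm{d}P_u .$$

Roughly speaking, the purely diagonal part of $K$ plays the role of a standard feedback on past squared returns, while the off-diagonal structure encodes a trend-sensitive ("Zumbach"-like) feedback that standard Hawkes models miss.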
Article
We investigate the asymptotic behavior, as time goes to infinity, of Hawkes processes whose regression kernel has $L^1$ norm close to one and a power-law tail of the form $x^{-(1+\alpha)}$, with $\alpha \in (0,1)$. We prove in particular that when $\alpha \in (1/2,1)$, after suitable rescaling, their law converges to that of a kind of integrated fractional Cox-Ingersoll-Ross process, with associated Hurst parameter $H = \alpha - 1/2$. This result is in contrast to the case of a regression kernel with light tail, where a classical Brownian CIR process is obtained in the limit. Interestingly, it shows that persistence properties in the point process can lead to an irregular behavior of the limiting process. This theoretical result enables us to give an agent-based foundation to some recent findings about the rough nature of volatility in financial markets.
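In generic notation (ours, not necessarily the authors'), the limiting object can be written as a fractional, Volterra-type CIR process,

$$ V_t = V_0 + \frac{1}{\Gamma(\alpha)}\int_0^t (t-s)^{\alpha-1}\,\kappa(\theta - V_s)\,\mathrm{d}s + \frac{1}{\Gamma(\alpha)}\int_0^t (t-s)^{\alpha-1}\,\nu\sqrt{V_s}\,\mathrm{d}W_s, $$

whose sample paths have Hölder regularity close to $H = \alpha - 1/2 < 1/2$, i.e. "rough" volatility; here $\kappa$, $\theta$ and $\nu$ are generic mean-reversion, level and vol-of-vol parameters.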
Article
In this paper we propose an overview of the recent academic literature devoted to applications of Hawkes processes in finance. Hawkes processes constitute a particular class of multivariate point processes that has become very popular in empirical high-frequency finance over the last decade. After a reminder of the main definitions and properties that characterize Hawkes processes, we review their main empirical applications to many different problems in high-frequency finance. Because of their great flexibility and versatility, we show that they have been successfully applied to issues as diverse as estimating the volatility at the level of transaction data, estimating market stability, accounting for systemic risk contagion, devising optimal execution strategies and capturing the dynamics of the full order book.
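As a concrete reminder of the basic object (a minimal sketch of ours, with an exponential kernel and arbitrary parameters), a univariate Hawkes process has intensity $\lambda(t) = \mu + \sum_{t_i < t} \alpha\, e^{-\beta (t - t_i)}$ and can be simulated by Ogata's thinning algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hawkes(mu, alpha, beta, T):
    """Univariate Hawkes process with an exponential kernel, simulated by Ogata thinning.
    Intensity: lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i));
    stationarity requires a branching ratio alpha / beta < 1."""
    events, t = [], 0.0
    while t < T:
        # Between events the intensity only decays, so its current value is a valid upper bound.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)                  # candidate event time
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if t < T and rng.uniform() < lam_t / lam_bar:        # thinning (accept/reject) step
            events.append(t)
    return np.array(events)                                  # naive O(n^2) sums: fine for a sketch

ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=500.0)
print(f"{len(ev)} events; empirical rate {len(ev) / 500.0:.2f} vs "
      f"theoretical mu / (1 - alpha / beta) = {0.5 / (1 - 0.8 / 1.2):.2f}")
```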
Article
We present extensive evidence that "risk premium" is strongly correlated with tail-risk skewness but only very weakly with volatility. We introduce a new, intuitive definition of skewness and elicit a linear relation between the Sharpe ratio of various risk-premium strategies (Equity, Fama-French, FX Carry, Short Vol, Bonds, Credit) and their negative skewness. We find one clear exception to this rule: trend following (and perhaps the Fama-French "High minus Low"), which has positive skewness and positive excess returns, suggesting that some strategies are not risk premia but genuine market anomalies. Based on our results, we propose an objective criterion to assess the quality of a risk-premium portfolio.
Article
We revisit the Kolmogorov–Smirnov and Cramér–von Mises goodness-of-fit (GoF) tests and propose a generalization to identically distributed, but dependent, univariate random variables. We show that the dependence leads to a reduction of the 'effective' number of independent observations. The generalized GoF tests are not distribution-free but rather depend on all the lagged bivariate copulas. These objects, which we call 'self-copulas', encode all the non-linear temporal dependences. We introduce a specific, log-normal model for these self-copulas, for which a number of analytical results are derived. An application to financial time series is provided. As is well known, the dependence is found to be long-ranged in this case, a finding that we confirm using self-copulas. As a consequence, the acceptance rates of GoF tests are substantially higher than if the returns were iid random variables.
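A minimal sketch of the basic idea, not the paper's self-copula machinery: for dependent data one can rescale the Kolmogorov–Smirnov statistic with an effective number of independent observations rather than the raw sample size. The autocorrelation-based proxy for the effective sample size below is an assumption of ours, purely for illustration.

```python
import numpy as np
from scipy.stats import kstwobign, norm

rng = np.random.default_rng(1)

def ks_stat(x, cdf):
    """Two-sided Kolmogorov-Smirnov distance between the empirical CDF of x and cdf."""
    xs = np.sort(x)
    n = len(xs)
    f = cdf(xs)
    return max(np.max(np.arange(1, n + 1) / n - f), np.max(f - np.arange(n) / n))

def effective_n(x, max_lag=200):
    """Crude effective sample size N / (1 + 2 * sum of positive autocorrelations).
    This is only an illustrative proxy, NOT the self-copula correction of the paper."""
    u = x - x.mean()
    acf = [np.corrcoef(u[:-k], u[k:])[0, 1] for k in range(1, max_lag)]
    return len(x) / (1 + 2 * sum(a for a in acf if a > 0))

# Strongly autocorrelated AR(1) data, tested against its true stationary marginal law
phi, T = 0.8, 20000
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal()
true_cdf = lambda v: norm.cdf(v, scale=1.0 / np.sqrt(1.0 - phi ** 2))

d = ks_stat(x, true_cdf)
for label, n in (("naive N    ", len(x)), ("effective N", effective_n(x))):
    print(f"{label} = {n:8.0f}   asymptotic KS p-value = {kstwobign.sf(d * np.sqrt(n)):.3f}")
```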
Article
We consider sample covariance matrices $S_N = \frac{1}{p}\,\Sigma_N^{1/2} X_N X_N^* \Sigma_N^{1/2}$, where $X_N$ is an $N \times p$ real or complex matrix with i.i.d. entries with finite 12th moment and $\Sigma_N$ is an $N \times N$ positive definite matrix. In addition we assume that the spectral measure of $\Sigma_N$ almost surely converges to some limiting probability distribution as $N \to \infty$ and $p/N \to \gamma > 0$. We quantify the relationship between sample and population eigenvectors by studying the asymptotics of functionals of the type $\frac{1}{N}\,\mathrm{Tr}\big( g(\Sigma_N) (S_N - zI)^{-1} \big)$, where $I$ is the identity matrix, $g$ is a bounded function and $z$ is a complex number. This is then used to compute the asymptotically optimal bias correction for sample eigenvalues, paving the way for a new generation of improved estimators of the covariance matrix and its inverse.
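As a direct numerical illustration (ours; it evaluates the functional by brute force rather than via the paper's asymptotic formulas or the optimal bias correction), one can simulate $S_N$ for a known $\Sigma_N$ and compute $\frac{1}{N}\mathrm{Tr}\big(g(\Sigma_N)(S_N - zI)^{-1}\big)$ for a few test functions $g$; the dimensions and spectrum below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 300, 900                                   # so that p / N = gamma = 3
pop_eigs = np.linspace(0.5, 2.5, N)               # spectrum of the population covariance
sigma_sqrt = np.diag(np.sqrt(pop_eigs))           # Sigma^(1/2) (diagonal for simplicity)

X = rng.normal(size=(N, p))
S = sigma_sqrt @ X @ X.T @ sigma_sqrt / p         # sample covariance S_N

def functional(g, z):
    """(1/N) Tr( g(Sigma) (S - z I)^(-1) ), evaluated by brute force."""
    G = np.diag(g(pop_eigs))                      # g applied to the population spectrum
    R = np.linalg.inv(S - z * np.eye(N))          # resolvent of the sample covariance
    return np.trace(G @ R) / N

z = 1.0 + 0.05j                                   # a point slightly above the real axis
print("g = 1 (Stieltjes transform of S):", functional(np.ones_like, z))
print("g = identity                    :", functional(lambda x: x, z))
```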
Article
Linking microscopic and macroscopic behavior is at the heart of many natural and social sciences. This apparent similarity conceals essential differences across disciplines: whereas physical particles are assumed to optimize the global energy, economic agents maximize their own utility. Here, we solve exactly a Schelling-like segregation model which interpolates continuously between cooperative and individual dynamics. We show that increasing the degree of cooperativity induces a qualitative transition from a segregated phase of low utility toward a mixed phase of high utility. By introducing a simple function that links the individual and global levels, we pave the way to a rigorous approach to a wide class of systems where dynamics are governed by individual strategies.
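A heavily simplified sketch of the interpolation idea (our toy implementation with a logit acceptance rule, a tent-shaped utility and arbitrary parameters, not the exactly solvable model of the paper): a proposed relocation is accepted according to a weighted mix of the mover's own utility change and the change in total utility, with the weight c playing the role of the degree of cooperativity.

```python
import numpy as np

rng = np.random.default_rng(3)
L = 12                                                             # small periodic grid
grid = rng.choice([1, -1, 0], size=(L, L), p=[0.45, 0.45, 0.10])   # two agent types + vacancies

def neighbours(i, j):
    return [((i + di) % L, (j + dj) % L)
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def utility(g, i, j, kind):
    """Tent-shaped utility peaking at mixed neighbourhoods (our choice, not the paper's exact form)."""
    nb = [g[x] for x in neighbours(i, j) if g[x] != 0]
    if not nb:
        return 0.0
    frac = sum(n == kind for n in nb) / len(nb)              # fraction of like neighbours
    return 2 * frac if frac <= 0.5 else 2 * (1 - frac)

def total_utility(g):
    return sum(utility(g, i, j, g[i, j]) for i, j in np.argwhere(g != 0))

def step(g, c, beta=5.0):
    """One attempted relocation; c = 0 is purely individual, c = 1 purely cooperative."""
    occ, empty = np.argwhere(g != 0), np.argwhere(g == 0)
    (i, j), (k, l) = occ[rng.integers(len(occ))], empty[rng.integers(len(empty))]
    kind = g[i, j]
    u_self, u_tot = utility(g, i, j, kind), total_utility(g)
    g[i, j], g[k, l] = 0, kind                               # tentative move
    gain = (1 - c) * (utility(g, k, l, kind) - u_self) + c * (total_utility(g) - u_tot)
    if rng.uniform() > 1.0 / (1.0 + np.exp(-beta * gain)):   # logit acceptance rule
        g[k, l], g[i, j] = 0, kind                           # rejected: undo the move

for c in (0.0, 1.0):                                         # individual vs cooperative dynamics
    g = grid.copy()
    for _ in range(1000):                                    # short run, for illustration only
        step(g, c)
    print(f"c = {c:.0f}: mean utility per agent = {total_utility(g) / np.count_nonzero(g):.2f}")
```

A short run like this only illustrates the mechanics of the interpolation; mapping out the segregated-to-mixed transition requires much longer simulations (or the exact solution of the paper).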
Article
We present a theory of excess stock market volatility, in which market movements are due to trades by very large institutional investors in relatively illiquid markets. Such trades generate significant spikes in returns and volume, even in the absence of important news about fundamentals. We derive the optimal trading behavior of these investors, which allows us to provide a unified explanation for apparently disconnected empirical regularities in returns, trading volume and investor size.
Article
We show that the cost of market orders and the profit of infinitesimal market-making or -taking strategies can be expressed in terms of directly observable quantities, namely the spread and the lag-dependent impact function. Imposing that any market-taking or liquidity-providing strategy is at best marginally profitable, we obtain a linear relation between the bid-ask spread and the instantaneous impact of market orders, in good agreement with our empirical observations on electronic markets. We then use this relation to justify a strong, and hitherto unnoticed, empirical correlation between the spread and the volatility per trade, with R² values exceeding 0.9. This correlation suggests both that the main determinant of the bid-ask spread is adverse selection, and that most of the volatility comes from trade impact. We argue that the role of the time horizon appearing in the definition of costs is crucial, and that long-range correlations in the order flow, overlooked in previous studies, must be carefully factored in. We find that the spread is significantly larger on the NYSE, a liquid market with specialists, where monopoly rents appear to be present.
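In generic notation (ours), the correlation referred to above is usually written as a proportionality between the spread and the volatility per trade,

$$ S \;\simeq\; c\,\sigma_1, \qquad \sigma_1 = \frac{\sigma_{\mathrm{day}}}{\sqrt{N_{\mathrm{day}}}}, $$

with a prefactor $c$ of order unity, where $\sigma_{\mathrm{day}}$ is the daily volatility and $N_{\mathrm{day}}$ the daily number of trades.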
Article
In order to understand the origin of stock price jumps, we cross-correlate high-frequency time series of stock returns with different news feeds. We find that neither idiosyncratic news nor market-wide news can explain the frequency and amplitude of price jumps. We find that the volatility patterns around jumps and around news are quite different: jumps are followed by increased volatility, whereas news tend on average to be followed by lower volatility levels. The shape of the volatility relaxation is also markedly different in the two cases. Finally, we provide direct evidence that large transaction volumes are not responsible for large price jumps. We conjecture that most price jumps are induced by order-flow fluctuations close to the point of vanishing liquidity.
Article
In this paper, we provide a simple, "generic" interpretation of multifractal scaling laws and multiplicative cascade process paradigms in terms of volatility correlations. We show that in this context 1/f power spectra, as observed recently by Bonanno et al., naturally emerge. We then propose a simple solvable "stochastic volatility" model for return fluctuations. This model is able to reproduce most recent empirical findings concerning financial time series: no correlation between price variations, long-range volatility correlations and multifractal statistics. Moreover, its extension to a multivariate context, in order to model portfolio behavior, is very natural. Comparisons to real data and to other models proposed elsewhere are provided.
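A minimal sketch of a model in this family (our simplified discretisation of a log-normal, long-ranged stochastic volatility process; parameters are arbitrary and this is not the paper's exact construction): returns are conditionally Gaussian with a log-volatility whose covariance decays logarithmically, which produces uncorrelated returns but long-ranged volatility correlations.

```python
import numpy as np

rng = np.random.default_rng(4)
n, lam2, T_corr = 2000, 0.03, 500       # sample length, intermittency lambda^2, correlation horizon

# Gaussian log-volatility with logarithmically decaying covariance, zero beyond T_corr
tau = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
cov = lam2 * np.log(np.maximum(T_corr / (1.0 + tau), 1.0))
w, v = np.linalg.eigh(cov)
omega = v @ (np.sqrt(np.clip(w, 0.0, None)) * rng.normal(size=n))   # clip tiny negative eigenvalues

r = np.exp(omega) * rng.normal(size=n)   # returns: log-normal volatility times Gaussian noise

def acf(x, k):
    return np.corrcoef(x[:-k], x[k:])[0, 1]

print("return autocorrelation, lag 1          :", round(acf(r, 1), 3))
print("|return| autocorrelation, lags 1/10/100:",
      [round(acf(np.abs(r), k), 3) for k in (1, 10, 100)])
```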
Article
Building log-normal-like rough volatility models with proper Zumbach effect using a microstructural approach
Book
The real world is perceived and broken down as data, models and algorithms in the eyes of physicists and engineers. Data is noisy by nature and classical statistical tools have so far been successful in dealing with relatively smaller levels of randomness. The recent emergence of Big Data and the required computing power to analyse them have rendered classical tools outdated and insufficient. Tools such as random matrix theory and the study of large sample covariance matrices can efficiently process these big data sets and help make sense of modern, deep learning algorithms. Presenting an introductory calculus course for random matrices, the book focusses on modern concepts in matrix theory, generalising the standard concept of probabilistic independence to non-commuting random variables. Concretely worked out examples and applications to financial engineering and portfolio construction make this unique book an essential tool for physicists, engineers, data analysts, and economists.
Book
The widespread availability of high-quality, high-frequency data has revolutionised the study of financial markets. By describing not only asset prices, but also market participants' actions and interactions, this wealth of information offers a new window into the inner workings of the financial ecosystem. In this original text, the authors discuss empirical facts of financial markets and introduce a wide range of models, from the micro-scale mechanics of individual order arrivals to the emergent, macro-scale issues of market stability. Throughout this journey, data is king. All discussions are firmly rooted in the empirical behaviour of real stocks, and all models are calibrated and evaluated using recent data from Nasdaq. By confronting theory with empirical facts, this book for practitioners, researchers and advanced students provides a fresh, new, and often surprising perspective on topics as diverse as optimal trading, price impact, the fragile nature of liquidity, and even the reasons why people trade at all.
Article
The aim of our work is to propose a natural framework to account for all the empirically known properties of the multivariate distribution of stock returns. We define and study a "nested factor model", where the linear factor part is standard, but where the log-volatilities of the linear factors and of the residuals are themselves endowed with a factor structure and residuals. We propose a calibration procedure to estimate these log-vol factors and residuals. We find that whereas the number of relevant linear factors is relatively large (10 or more), only two or three log-vol factors emerge in our analysis of the data. In fact, a minimal model in which only one log-vol factor is considered is already very satisfactory, as it accurately reproduces the properties of bivariate copulas, in particular the dependence of the medial point on the linear correlation coefficient, as reported in Chicheportiche and Bouchaud (2012). We have tested the ability of the model to predict, out of sample, the risk of non-linear portfolios, and found that it performs significantly better than other schemes.
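A generative sketch of the nested structure (our toy parameterisation with a single common log-volatility factor, arbitrary loadings and no idiosyncratic log-vol components; it is not the paper's calibration): the same log-vol factor multiplies both the linear factors and the residuals, which creates non-linear dependence between assets even when their linear correlation is modest.

```python
import numpy as np

rng = np.random.default_rng(5)
T, n_assets, n_lin = 2500, 50, 10                  # days, assets, linear factors

beta = rng.normal(0.0, 0.3, size=(n_assets, n_lin))
beta[:, 0] = 1.0                                   # first linear factor ~ market mode
A = rng.uniform(0.1, 0.4, size=n_lin)              # log-vol loadings of the linear factors
B = rng.uniform(0.1, 0.4, size=n_assets)           # log-vol loadings of the residuals

Omega = rng.normal(size=T)                                         # common log-vol factor
f = rng.normal(size=(T, n_lin)) * np.exp(np.outer(Omega, A))       # factor returns
e = rng.normal(size=(T, n_assets)) * np.exp(np.outer(Omega, B))    # residual returns
r = f @ beta.T + e                                                 # asset returns

off = ~np.eye(n_assets, dtype=bool)
lin_corr = np.corrcoef(r, rowvar=False)[off].mean()
abs_corr = np.corrcoef(np.abs(r), rowvar=False)[off].mean()
print(f"mean pairwise linear correlation  : {lin_corr:.3f}")
print(f"mean pairwise |return| correlation: {abs_corr:.3f}  (volatility co-movement)")
```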
Article
The aim of this work is to explore the possible types of phenomena that simple macroeconomic Agent-Based models (ABM) can reproduce. Our motivation is to understand the large macro-economic fluctuations observed in the "Mark I" ABM devised by D. Delli Gatti and collaborators. Our major finding is the existence of a first order (discontinuous) phase transition between a "good economy" where unemployment is low, and a "bad economy" where unemployment is high. We show that this transition is robust against many modifications of the model, and is induced by an asymmetry between the rate of hiring and the rate of firing of the firms. This asymmetry is induced, in Mark I, by the interest rate. As the interest rate increases, the firms become more and more reluctant to take further loans. The unemployment level remains small until a tipping point beyond which the economy suddenly collapses. If the parameters are such that the system is close to this transition, any small fluctuation is amplified as the system jumps between the two equilibria. We have also explored several natural extensions. One is to allow this hiring/firing propensity to depend on the financial fragility of firms. Quite interestingly, we find that in this case, the above transition survives but becomes second order. We also studied simple wage policies and confidence feedback effects, whereby higher unemployment increases the saving propensity of households. We observe several interesting effects, such as the appearance of acute endogenous crises, during which the unemployment rate shoots up before the economy recovers. We end the paper with general comments on the usefulness of ABMs to model macroeconomic phenomena, in particular in view of the time needed to reach a steady state.
Article
We model the arrival of mid-price changes in the E-mini S&P futures contract as a self-exciting Hawkes process. Using several estimation methods, we find that the Hawkes kernel is power-law with a decay exponent close to −1.15 at short times, less than ≈ 10³ s, and crosses over to a second power-law regime with a larger decay exponent ≈ −1.45 for longer time scales, in the range [10³, 10⁶] seconds. More importantly, we find that the Hawkes kernel integrates to unity independently of the analysed period, from 1998 to 2011. This suggests that markets are and have always been close to criticality, challenging a recent study which indicates that reflexivity (endogeneity) has increased in recent years as a result of increased automation of trading. However, we note that the scale over which market events are correlated has decreased steadily over time with the emergence of higher-frequency trading.
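A crude way to see whether the kernel norm (the "branching ratio") is close to one is a dispersion-of-counts estimate, sketched below on simulated data; this moment estimator and its exponential-kernel test case are our simplifications, not the parametric and non-parametric methods used in the study.

```python
import numpy as np

rng = np.random.default_rng(6)

def hawkes_cluster(mu, n, beta, T):
    """Hawkes events on [0, T] via the branching representation: immigrants arrive
    at rate mu, each event spawns Poisson(n) children with Exp(beta) delays."""
    events, stack = [], list(rng.uniform(0.0, T, rng.poisson(mu * T)))
    while stack:
        t = stack.pop()
        events.append(t)
        for d in rng.exponential(1.0 / beta, rng.poisson(n)):
            if t + d < T:
                stack.append(t + d)
    return np.sort(events)

def branching_ratio(events, T, window):
    """Moment estimate based on the count dispersion in long windows:
    Var(N) / E(N) -> 1 / (1 - n)^2, hence n ~ 1 - sqrt(mean / var)."""
    counts, _ = np.histogram(events, bins=np.arange(0.0, T + window, window))
    return 1.0 - np.sqrt(counts.mean() / counts.var())

T = 50_000.0
ev = hawkes_cluster(mu=0.2, n=0.8, beta=1.0, T=T)
print(f"true kernel norm n = 0.8, dispersion estimate = {branching_ratio(ev, T, 500.0):.3f}")
```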
Article
We study a generic model for self-referential behaviour in financial markets, where agents build strategies based on correlations, estimated from the past history itself, between certain quantitative information and the price. The impact of these strategies on the price modifies the observed correlations and creates a feedback loop that can destabilize the market away from efficient behaviour. For large enough feedback, non-trivial correlations spontaneously set in and the market switches between two long-lived states, which we call conventions. This mechanism leads to overreaction and excess volatility. We provide empirical evidence for the existence of such long-lasting anomalous correlations in real markets.
Article
We revisit the index leverage effect, which can be decomposed into a volatility effect and a correlation effect. We investigate the latter using a matrix regression analysis that we call 'Principal Regression Analysis' (PRA), for which we provide some analytical (using Random Matrix Theory) and numerical benchmarks. We find that downward index trends increase the average correlation between stocks (as measured by the most negative eigenvalue of the conditional correlation matrix) and make the market mode more uniform. Upward trends also increase the average correlation between stocks, but rotate the corresponding market mode away from uniformity. There are two time scales associated with these effects: a short one on the order of a month (20 trading days), and a longer one on the order of a year. We also find indications of a leverage effect for sectorial correlations as well, which reveals itself in the second and third modes of the PRA.
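The paper's PRA is a matrix regression of the correlation matrix on the index return; the sketch below only illustrates the simpler idea of comparing the market mode of correlation matrices conditioned on the index sign, on synthetic data with a built-in asymmetry (our assumption, chosen so that the conditional modes differ).

```python
import numpy as np

rng = np.random.default_rng(7)
T, n = 2500, 40

# Toy returns: uniform, stronger market exposures on down days,
# heterogeneous exposures on up days (purely illustrative).
m = rng.normal(size=T)                               # index / market factor
beta_up = rng.uniform(0.2, 1.0, size=n)
beta_down = np.full(n, 0.8)
B = np.where(m[:, None] < 0, beta_down, beta_up)
r = B * m[:, None] + rng.normal(size=(T, n))

def top_mode(x):
    """Largest eigenvalue and eigenvector of the correlation matrix of x."""
    w, v = np.linalg.eigh(np.corrcoef(x, rowvar=False))
    return w[-1], v[:, -1]

for label, mask in (("index down", m < 0), ("index up  ", m >= 0)):
    lam, vec = top_mode(r[mask])
    uniformity = np.abs(vec).mean() * np.sqrt(n)     # = 1 for a perfectly uniform market mode
    print(f"{label}: top eigenvalue = {lam:5.1f}, market-mode uniformity = {uniformity:.3f}")
```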
Article
Time reversal invariance can be summarized as follows: no difference can be measured if a sequence of events is run forward or backward in time. Because price time series are dominated by randomness that hides possible structures and orders, investigating time reversal invariance requires care. Different statistics are constructed with the property of being zero for time series that are time-reversal invariant; they all show that high-frequency empirical foreign exchange prices are not invariant. The same statistics are applied to mathematical processes that should mimic empirical prices. Monte Carlo simulations show that only some ARCH processes with a multi-timescale structure can reproduce the empirical findings. A GARCH(1,1) process can only reproduce some of the asymmetry. On the other hand, all stochastic-volatility-type processes are time-reversal invariant. This clear difference, related to the process structure, provides a strong selection criterion among processes.
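One simple example of such a statistic (ours, not necessarily one of those used in the paper) compares how well coarse-grained past volatility predicts fine-grained future volatility with the time-reversed counterpart; for a time-reversal-invariant series the difference should be compatible with zero.

```python
import numpy as np

rng = np.random.default_rng(8)

def tr_asymmetry(r, tau, half=10):
    """Corr(coarse vol at t, |r| at t + tau) minus its time-reversed counterpart,
    using a centred moving average of |r| as the coarse volatility.  Vanishes
    (up to noise) for time-reversal-invariant series."""
    fine = np.abs(r)
    w = 2 * half + 1
    coarse = np.convolve(fine, np.ones(w) / w, mode="valid")
    fine = fine[half:-half]                                   # align with the centred average
    a = np.corrcoef(coarse[:-tau], fine[tau:])[0, 1]          # past coarse -> future fine
    b = np.corrcoef(fine[:-tau], coarse[tau:])[0, 1]          # past fine -> future coarse
    return a - b

T = 50_000
# GARCH(1,1)-like returns (volatility feedback) vs an iid, hence time-reversible, benchmark
r_garch, sig2 = np.empty(T), 1.0
for t in range(T):
    r_garch[t] = np.sqrt(sig2) * rng.normal()
    sig2 = 0.05 + 0.90 * sig2 + 0.05 * r_garch[t] ** 2
r_iid = rng.standard_t(4, size=T)

for name, r in (("GARCH(1,1) ", r_garch), ("iid Student", r_iid)):
    print(name, [round(tr_asymmetry(r, tau), 4) for tau in (1, 5, 20)])
```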
Bun, J., Bouchaud, J.-P., & Potters, M. (2016). Cleaning Correlation Matrices. https://www.risk.net/risk-magazine/technical-paper/2452666