
The essay offers a new understanding of how financial markets work. The key departure from conventional theory is to recognize that investors do not invest directly in securities but through agents such as fund managers. Agents have better information and different objectives than their customers (principals), and this asymmetry is shown to be the source of inefficiency: mispricing, bubbles and crashes. A separate outcome is that agents are in a position to capture for themselves the bulk of the returns from financial innovations. Principal/agent problems do a good job of explaining how the global finance sector has become so bloated, profitable and prone to crisis. Remedial action involves the principals changing the way they contract with, and instruct, agents. The essay ends with a manifesto of policies that pension funds and other large investors can adopt to mitigate the destructive features of delegation, both for their individual benefit and to promote social welfare in the form of a leaner, more efficient and more stable finance sector.

We use methods of random matrix theory to analyze the cross-correlation matrix C of stock price changes of the largest 1000 U.S. companies for the 2-year period 1994-1995. We find that the statistics of most of the eigenvalues in the spectrum of C agree with the predictions of random matrix theory, but there are deviations for a few of the largest eigenvalues. We find that C has the universal properties of the Gaussian orthogonal ensemble of random matrices. Furthermore, we analyze the eigenvectors of C through their inverse participation ratio and find eigenvectors with large ratios at both edges of the eigenvalue spectrum-a situation reminiscent of localization theory results.
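The random-matrix benchmark used in this kind of analysis can be sketched as follows: for purely random returns, the eigenvalues of the correlation matrix should fall within the Marchenko-Pastur band, and deviations beyond the upper edge signal genuine correlation structure. The dimensions and tolerance below are illustrative, not those of the paper.

```python
import numpy as np

# Sketch: compare the eigenvalue spectrum of a correlation matrix of
# purely random (i.i.d.) returns against the Marchenko-Pastur bounds
# used in random-matrix analyses of stock correlations.
rng = np.random.default_rng(0)
N, T = 100, 500                      # 100 "stocks", 500 observations (assumed)
returns = rng.standard_normal((T, N))
C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

q = N / T                            # aspect ratio
lam_min = (1 - np.sqrt(q)) ** 2      # theoretical lower edge
lam_max = (1 + np.sqrt(q)) ** 2      # theoretical upper edge

# For i.i.d. data, (almost) all eigenvalues fall inside [lam_min, lam_max];
# in real stock data a few large eigenvalues escape the upper edge.
inside = np.mean((eigvals > lam_min - 0.1) & (eigvals < lam_max + 0.1))
print(f"fraction inside MP band: {inside:.2f}")
```

Running the same spectral comparison on real returns, rather than simulated ones, is what reveals the handful of "market mode" eigenvalues the abstract describes.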

One of the most influential ideas in the past 30 years is the Efficient Markets Hypothesis, the idea that market prices incorporate all information rationally and instantaneously. However, the emerging discipline of behavioral economics and finance has challenged this hypothesis, arguing that markets are not rational, but are driven by fear and greed instead. Recent research in the cognitive neurosciences suggests that these two perspectives are opposite sides of the same coin. In this article I propose a new framework that reconciles market efficiency with behavioral alternatives by applying the principles of evolution - competition, adaptation, and natural selection - to financial interactions. By extending Herbert Simon's notion of "satisficing" with evolutionary dynamics, I argue that much of what behavioralists cite as counterexamples to economic rationality - loss aversion, overconfidence, overreaction, mental accounting, and other behavioral biases - are, in fact, consistent with an evolutionary model of individuals adapting to a changing environment via simple heuristics. Despite the qualitative nature of this new paradigm, the Adaptive Markets Hypothesis offers a number of surprisingly concrete implications for the practice of portfolio management.

Throughout history, rich and poor countries alike have been lending, borrowing, crashing--and recovering--their way through an extraordinary range of financial crises. Each time, the experts have chimed, "this time is different"--claiming that the old rules of valuation no longer apply and that the new situation bears little similarity to past disasters. With this breakthrough study, leading economists Carmen Reinhart and Kenneth Rogoff definitively prove them wrong. Covering sixty-six countries across five continents, This Time Is Different presents a comprehensive look at the varieties of financial crises, and guides us through eight astonishing centuries of government defaults, banking panics, and inflationary spikes--from medieval currency debasements to today's subprime catastrophe. Carmen Reinhart and Kenneth Rogoff, leading economists whose work has been influential in the policy debate concerning the current financial crisis, provocatively argue that financial combustions are universal rites of passage for emerging and established market nations. The authors draw important lessons from history to show us how much--or how little--we have learned. Using clear, sharp analysis and comprehensive data, Reinhart and Rogoff document that financial fallouts occur in clusters and strike with surprisingly consistent frequency, duration, and ferocity. They examine the patterns of currency crashes, high and hyperinflation, and government defaults on international and domestic debts--as well as the cycles in housing and equity prices, capital flows, unemployment, and government revenues around these crises. While countries do weather their financial storms, Reinhart and Rogoff prove that short memories make it all too easy for crises to recur. An important book that will affect policy discussions for a long time to come, This Time Is Different exposes centuries of financial missteps.

In the run-up to the recent financial crisis, an increasingly elaborate set of financial instruments emerged, intended to optimize returns to individual institutions with seemingly minimal risk. Essentially no attention was given to their possible effects on the stability of the system as a whole. Drawing analogies with the dynamics of ecological food webs and with networks within which infectious diseases spread, we explore the interplay between complexity and stability in deliberately simplified models of financial networks. We suggest some policy lessons that can be drawn from such models, with the explicit aim of minimizing systemic risk.
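A deliberately simplified contagion sketch in the spirit of such models (all parameters and the network structure are assumed, purely for illustration): banks hold equal exposures to random counterparties, a bank fails when losses from failed counterparties exceed its capital buffer, and the cascade iterates to a fixed point.

```python
import numpy as np

# Toy default cascade on a random interbank network (illustrative only).
rng = np.random.default_rng(4)
n_banks, n_links, capital = 50, 5, 0.15

# Random directed exposure network: bank i lends to n_links random banks,
# each loan being 1/n_links of its interbank assets.
exposure = np.zeros((n_banks, n_banks))
for i in range(n_banks):
    partners = rng.choice([j for j in range(n_banks) if j != i],
                          n_links, replace=False)
    exposure[i, partners] = 1.0 / n_links

failed = np.zeros(n_banks, dtype=bool)
failed[0] = True                      # one initial failure
while True:
    losses = exposure @ failed        # losses from exposures to failed banks
    new_failed = failed | (losses > capital)
    if (new_failed == failed).all():
        break                         # cascade has reached a fixed point
    failed = new_failed

print(f"{failed.sum()} of {n_banks} banks failed")
```

Varying `capital` and `n_links` in such a toy model is one way to explore the complexity-stability interplay the abstract refers to: denser networks diversify small shocks but can propagate large ones.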

This paper reports the findings of a survey and interviews with graduate students at seven top-ranking graduate economics programs. It finds that over the last 15 years graduate economics programs have become more empirical, less mathematical and less theoretically oriented, and that the students are generally positive about the profession. It also finds fewer differences among schools. Despite the improvements and greater student satisfaction, the paper suggests that there are serious pedagogical questions about the focus of the core on mathematical techniques rather than on creativity and economic reasoning, which students see as the true core of economics.

"Nowhere does history indulge in repetitions so often or so uniformly as in Wall Street," observed legendary speculator Jesse Livermore. History tells us that periods of major technological innovation are typically accompanied by speculative bubbles as economic agents overreact to genuine advancements in productivity. Excessive run-ups in asset prices can have important consequences for the economy as firms and investors respond to the price signals, resulting in capital misallocation. On the one hand, speculation can magnify the volatility of economic and financial variables, thus harming the welfare of those who are averse to uncertainty and fluctuations. But on the other hand, speculation can increase investment in risky ventures, thus yielding benefits to a society that suffers from an underinvestment problem.

Burton Malkiel’s 1973 A Random Walk Down Wall Street was an explosive contribution to debates about how to reap a good return on investing in stocks and shares. Reissued and updated many times since, Malkiel’s text remains an indispensable contribution to the world of investment strategy - one that continues to cause controversy among investment professionals today. At the book’s heart lies a simple question of evaluation: just how successful are investment experts? The financial world was, and is, full of people who claim to have the knowledge and expertise to outperform the markets, and produce larger gains for investors as a result of their knowledge. But how successful, Malkiel asked, are they really? Via careful evaluations of performance - looking at those who invested via ‘technical analysis’ and ‘fundamental analysis’ - he was able to challenge the adequacy of many of the claims made for analysts’ success. Malkiel found the major active investment strategies to be significantly flawed. Where actively managed funds posted big gains one year, they seemingly inevitably posted below average gains in succeeding years. By evaluating the figures over the medium and long term, indeed, Malkiel discovered that actively-managed funds did far worse on average than those that passively followed the general market index. Though many investment professionals still argue against Malkiel’s influential findings, his exploration of the strengths and weaknesses of the argument for believing investors’ claims provides strong evidence that his own passive strategy wins out overall.

This book was originally published by Macmillan in 1936. It was voted the top Academic Book that Shaped Modern Britain by Academic Book Week (UK) in 2017, and in 2011 was placed on Time Magazine's top 100 non-fiction books written in English since 1923. Reissued with a fresh Introduction by the Nobel-prize winner Paul Krugman and a new Afterword by Keynes’ biographer Robert Skidelsky, this important work is made available to a new generation.
The General Theory of Employment, Interest and Money transformed economics and changed the face of modern macroeconomics. Keynes’ argument is based on the idea that the level of employment is not determined by the price of labour, but by the spending of money. It gave rise to an entirely new approach to employment, inflation and the market economy.
Highly provocative at its time of publication, this book and Keynes’ theories continue to remain the subject of much support and praise, criticism and debate. Economists at any stage in their career will enjoy revisiting this treatise and observing the relevance of Keynes’ work in today’s contemporary climate.

Professional investors are bombarded on a day-to-day basis with assertions about the role liquidity is playing and will play in determining prices in the financial markets. Few, if any, of the providers or recipients of such advice can truly claim to understand the wellsprings of such liquidity and the transmission mechanisms through which it impacts asset prices. This groundbreaking new book explores the belief that at the core of liquidity there is a force which compels individuals to effect a financial transaction when they would not otherwise do so. Understanding this force of compulsion is a key to understanding a financial market when it appears to be behaving irrationally. This book will enable new and seasoned investors to develop an understanding of these factors, so that costly mistakes can be avoided without the lesson of experience.

Many decisions are based on beliefs concerning the likelihood of uncertain events, such as the outcome of an election, the guilt of a defendant, or the future value of the dollar. Occasionally, beliefs concerning uncertain events are expressed in numerical form as odds or subjective probabilities. The subjective assessment of probability resembles the subjective assessment of physical quantities such as distance or size: these judgments are all based on data of limited validity, which are processed according to heuristic rules. For example, the apparent distance of an object is determined in part by its clarity, so that the more sharply the object is seen, the closer it appears to be; reliance on this rule, however, leads to systematic errors in the estimation of distance. In general, the heuristics are quite useful, but sometimes they lead to severe and systematic errors. This chapter describes three heuristics that are employed in making judgments under uncertainty. The first is representativeness, which is usually employed when people are asked to judge the probability that an object or event belongs to a class or process. The second is the availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development. The third is adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available.

A capital market is said to be efficient if it fully and correctly reflects all relevant information in determining security prices. Formally, the market is said to be efficient with respect to some information set, ϕ, if security prices would be unaffected by revealing that information to all participants. Moreover, efficiency with respect to an information set, ϕ, implies that it is impossible to make economic profits by trading on the basis of ϕ.

Portfolio volatility is the only source of risk in mean-variance optimality, but it fails to capture all the risks faced by leveraged portfolios. These risks include the possibility of margin calls and forced liquidations at adverse prices, as well as losses beyond the capital invested. To recognize these risks, the authors incorporated leverage aversion into the optimization process and examined the effects of volatility and leverage aversion on optimal long-short portfolios.

One of the most influential ideas in the past 30 years is the efficient markets hypothesis, the idea that market prices incorporate all information rationally and instantaneously. The emerging discipline of behavioral economics and finance has challenged the EMH, arguing that markets are not rational, but rather driven by fear and greed. Research in the cognitive neurosciences suggests these two perspectives are opposite sides of the same coin. An adaptive markets hypothesis that reconciles market efficiency with behavioral alternatives applies the principles of evolution-competition, adaptation, and natural selection-to financial interactions. Extending Simon's notion of "satisficing" with evolutionary dynamics, the author argues that much of what behaviorists cite as counter-examples to economic rationality-loss aversion, overconfidence, overreaction, mental accounting, and other behavioral biases-is in fact consistent with an evolutionary model of individual adaptation to a changing environment via simple heuristics. The adaptive markets hypothesis offers a number of surprisingly concrete implications for portfolio management.

Focardi and Fabozzi argue that current mainstream economics is not a science in the sense of the physical sciences, and they draw some conclusions from the point of view of asset management. Their key point is that economics, as embodied in general equilibrium theories, describes an idealized rational economic world rather than one based on empirical data. Although this argument has already been made, it has been virtually ignored by economists. The current crisis, however, requires an economic understanding anchored on a solid empirical basis. The authors review a number of facts, including the following: 1) market efficiency is a quantitative concept, with efficiency defined in terms of the magnitude of realistic profit opportunities; 2) the dynamic vector-like nature of inflation challenges current theories about inflation and the generation of money, making growth path-dependent; 3) economic conservation laws are key to understanding growth; and 4) a market economy cannot support an unbounded level of wealth and income inequality, because these become a destabilizing factor. The overall lesson for asset management is that economics matters and that the culture of pure speculation could profitably be replaced, for society as a whole, with a true culture of investment.


- Preference: The preferability of an alternative under conditions of risk; a matter of the decision maker's tastes.
- Risk-value models: Models of decision making under uncertainty assuming that the preference for an alternative is determined exclusively by its riskiness and its value or worth. Within such models, the decision problem is viewed as a choice among possible risk-value combinations, where the riskiness of each alternative is numerically represented by a risk measure.
- Alternative: Any action a decision maker may choose from the set of admissible actions characterizing a given decision problem.
- Perceived risk: The amount of risk attached to a given alternative according to the perception of an individual decision maker. This perception is determined by the size of potential losses and their probability.
- Risk measure: A real-valued function numerically representing an individual decision maker's risk ordering on a given set of alternatives. It quantifies the amount of perceived risk.
- Risk ordering: An ordering obtained directly by asking a decision maker which of any pair of alternatives he perceives as riskier. This ordering need not be related to the decision maker's preference ordering in any simple way.
- Random variable: A function defined on a set of random events with real values that are themselves regarded as random. In decision situations under risk, the possible outcomes of any alternative are usually regarded as a random variable with monetary values.
- Value-at-Risk: For a given time horizon and a confidence level 1 − α, the Value-at-Risk of a financial portfolio is the loss in market value over the time horizon that is exceeded by the portfolio only with probability α.
- Target outcome: An outcome such that every outcome below it is viewed as undesirable or risky, while outcomes at or above it are desirable or non-risky. This target may be the zero outcome, the status quo, a certain aspiration level, or the best result attainable in a given situation.
- Gain: Any outcome above a certain target return.
- Loss: Any outcome below a certain target outcome.
- Variance: A classical risk measure quantifying the risk of an alternative by the mean squared deviation of its potential outcomes from its mean outcome.
- Standard deviation: A classical risk measure quantifying the risk of an alternative by the square root of the mean squared deviation of its potential outcomes from its mean outcome.

Summary: The concept of risk is essential to many problems in economics and business. Usually, risk is treated in the traditional expected utility framework, where it is defined only indirectly through the shape of the utility function. The purpose of utility functions, however, is to model preferences. This article reviews those approaches that directly model risk judgements. After a review of standardized risk measures, recent theoretical developments in measures of perceived risk are presented.
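The Value-at-Risk definition above can be made concrete with a historical-simulation estimate. The P&L series below is simulated and all figures are illustrative only.

```python
import numpy as np

# Historical-simulation VaR: for confidence level 1 - alpha, VaR is the
# loss threshold exceeded with probability alpha over the horizon.
rng = np.random.default_rng(1)
pnl = rng.normal(loc=0.0, scale=10_000.0, size=5_000)  # simulated daily P&L

alpha = 0.05
# Loss = -P&L; VaR is the (1 - alpha) quantile of the loss distribution.
var_95 = np.quantile(-pnl, 1 - alpha)
print(f"95% one-day VaR: ${var_95:,.0f}")

# Sanity check: losses exceed the VaR on roughly a fraction alpha of days.
exceed_rate = np.mean(-pnl > var_95)
```

In practice the P&L history would come from actual portfolio revaluations rather than a Gaussian simulation, which is precisely where VaR's fragility under fat tails shows up.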

This second edition presents the advances made in financial market analysis since 2005. The book provides a careful introduction to stochastic methods, along with approximate ensembles for a single, historic time series. The new edition explains the history leading up to the biggest economic disaster of the 21st century. Empirical evidence for financial market instability under deregulation is given, together with a history of the worldwide explosion of the US dollar. A model shows how bounds set by a central bank stabilized FX in the gold-standard era, illustrating the effect of regulations. The book presents economic and finance theory thoroughly and critically, including rational expectations, cointegration and ARCH/GARCH methods, and replaces several of their misconceptions with empirically based ideas. This book will be of interest to finance theorists, traders, economists, physicists and engineers, and leads the reader to the frontier of research in time series analysis.

The concepts of portfolio optimization and diversification have been instrumental in the development and understanding of financial markets and financial decision making. In light of the 60 year anniversary of Harry Markowitz’s paper “Portfolio Selection,” we review some of the approaches developed to address the challenges encountered when using portfolio optimization in practice, including the inclusion of transaction costs, portfolio management constraints, and the sensitivity to the estimates of expected returns and covariances. In addition, we selectively highlight some of the new trends and developments in the area such as diversification methods, risk-parity portfolios, the mixing of different sources of alpha, and practical multi-period portfolio optimization.
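One of the diversification methods the survey highlights, risk parity, can be illustrated in its simplest inverse-volatility form. The volatility figures below are assumed purely for illustration, and this naive version ignores correlations.

```python
import numpy as np

# Naive "risk parity"-style allocation: weight each asset by the inverse
# of its volatility, so lower-volatility assets receive larger weights.
vols = np.array([0.10, 0.20, 0.40])           # assumed annualized volatilities
weights = (1.0 / vols) / np.sum(1.0 / vols)   # normalize to sum to 1

print(weights)
```

A full risk-parity construction equalizes each asset's contribution to total portfolio variance using the covariance matrix; the inverse-volatility rule is the correlation-free special case.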

Two easily measured variables, size and book-to-market equity, combine to capture the cross-sectional variation in average stock returns associated with market β, size, leverage, book-to-market equity, and earnings-price ratios. Moreover, when the tests allow for variation in β that is unrelated to size, the relation between market β and average return is flat, even when β is the only explanatory variable.

We propose that portfolio theory and mean-variance optimization be augmented to incorporate investor aversion to leverage and suggest a specification for leverage aversion that captures the unique risks of leverage. We also introduce mean-variance-leverage efficient frontiers, compare them with conventional mean-variance efficient frontiers, and develop the mean-variance-leverage efficient region. Our analysis shows that leverage aversion can have a large impact on an investor’s portfolio choice.
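A minimal sketch of the idea (notation and all parameter values are assumed here, not the authors' specification): augment the mean-variance objective with a quadratic penalty on leverage, then compare an unlevered portfolio with a levered one.

```python
import numpy as np

# Mean-variance-leverage utility sketch:
#   U(w) = mu'w - (tau_V / 2) w' Sigma w - (tau_L / 2) * L(w)^2,
# where L(w) = sum(|w_i|) - 1 is the leverage of a fully invested portfolio.
mu = np.array([0.06, 0.04])                 # assumed expected returns
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.02]])            # assumed covariance matrix
tau_V, tau_L = 2.0, 1.0                     # volatility and leverage aversion

def utility(w):
    leverage = np.sum(np.abs(w)) - 1.0
    return mu @ w - 0.5 * tau_V * (w @ Sigma @ w) - 0.5 * tau_L * leverage**2

w_unlevered = np.array([0.6, 0.4])          # long-only, leverage 0
w_levered = np.array([1.3, -0.3])           # 130/30-style, leverage 0.6

# With tau_L > 0, the levered portfolio's extra expected return is
# weighed against both its variance and its leverage.
print(utility(w_unlevered), utility(w_levered))
```

Sweeping `tau_L` from zero upward traces out the mean-variance-leverage efficient region the abstract describes: at `tau_L = 0` the problem collapses to conventional mean-variance optimization.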

On 16 November 2002, the first official case of Severe Acute Respiratory Syndrome (SARS) was recorded in Guangdong Province, China. Panic ensued. Uncertainty about its causes and contagious consequences brought many neighbouring economies across Asia to a standstill. Hotel occupancy rates in Hong Kong fell from over 80 % to less than 15 %, while among Beijing’s 5-star hotels occupancy rates fell below 2 %.

We argue that the present crisis and stalling economy continuing since 2007 are rooted in the delusionary belief in policies based on a "perpetual money machine" type of thinking. We document strong evidence that, since the early 1980s, consumption has been increasingly funded by smaller savings, booming financial profits, wealth extracted from house price appreciation and explosive debt. This is in stark contrast with the productivity-fueled growth that was seen in the 1950s and 1960s. This transition, starting in the early 1980s, was further supported by a climate of deregulation and a massive growth in financial derivatives designed to spread and diversify the risks globally. The result has been a succession of bubbles and crashes, including the worldwide stock market bubble and great crash of October 1987, the savings and loan crisis of the 1980s, the burst in 1991 of the enormous Japanese real estate and stock market bubbles, the emerging-market bubbles and crashes in 1994 and 1997, the LTCM crisis of 1998, the dotcom bubble bursting in 2000, the recent house price bubbles, the financialization bubble via special investment vehicles, the stock market bubble, the commodity and oil bubbles and the debt bubbles, all developing jointly and feeding on each other. Rather than still hoping that real wealth will come out of money creation, we need fundamentally new ways of thinking. In uncertain times it is essential, more than ever, to think in scenarios: what can happen in the future, and what would be the effect on your wealth and capital? How can you protect against adverse scenarios? We thus end by examining the question "what can we do?" from the macro level, discussing the fundamental issue of incentives and of constructing and predicting scenarios, as well as developing investment insights.

The lessons learned from the recent financial crisis should significantly reshape the economics profession's thinking, including, importantly, what we teach our students. Five such lessons are that we live in a monetary economy and therefore aggregate demand and policies that affect aggregate demand are determinants of real economic outcomes; that what actually matters for this purpose is not money but the volume, availability, and price of credit; that the fact that most lending is done by financial institutions matters as well; that the prices set in our financial markets do not always exhibit the “rationality” economists normally claim for them; and that both frictions and the uneven impact of economic events prevent us from adapting to disturbances in the way textbook economics suggests.

This paper proposes to estimate the covariance matrix of stock returns by an optimally weighted average of two existing estimators: the sample covariance matrix and single-index covariance matrix. This method is generally known as shrinkage, and it is standard in decision theory and in empirical Bayesian statistics. Our shrinkage estimator can be seen as a way to account for extra-market covariance without having to specify an arbitrary multifactor structure. For NYSE and AMEX stock returns from 1972 to 1995, it can be used to select portfolios with significantly lower out-of-sample variance than a set of existing estimators, including multifactor models.
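The shrinkage principle described above can be sketched as a convex combination of the noisy sample covariance and a highly structured target. Ledoit and Wolf shrink toward a single-index target with an optimally estimated weight; the sketch below fixes the weight and uses a scaled-identity target, purely to illustrate the mechanics.

```python
import numpy as np

# Shrinkage sketch: S_shrunk = delta * target + (1 - delta) * S.
rng = np.random.default_rng(2)
T, N = 60, 30                              # few observations per asset (assumed)
X = rng.standard_normal((T, N)) * 0.02     # simulated daily returns

S = np.cov(X, rowvar=False)                # sample covariance (noisy)
target = np.eye(N) * np.trace(S) / N       # structured target (scaled identity)
delta = 0.5                                # shrinkage intensity (assumed fixed)
S_shrunk = delta * target + (1 - delta) * S

# Shrinkage pulls extreme eigenvalues toward the center of the spectrum,
# improving conditioning and, in the paper's tests, out-of-sample variance.
cond_before = np.linalg.cond(S)
cond_after = np.linalg.cond(S_shrunk)
```

The paper's contribution is precisely that `delta` need not be guessed: an asymptotically optimal intensity can be estimated from the data.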

The General Theory of Employment, Interest, and Money / John Maynard Keynes.

We show that results from the theory of random matrices are potentially of great interest when trying to understand the statistical structure of the empirical correlation matrices appearing in the study of multivariate financial time series. We find a remarkable agreement between the theoretical prediction (based on the assumption that the correlation matrix is random) and empirical data concerning the density of eigenvalues associated to the time series of the different stocks of the S&P500 (or other major markets). Finally, we give a specific example to show how this idea can be successfully implemented for improving risk management.

If options are correctly priced in the market, it should not be possible to make sure profits by creating portfolios of long and short positions in options and their underlying stocks. Using this principle, a theoretical valuation formula for options is derived. Since almost all corporate liabilities can be viewed as combinations of options, the formula and the analysis that led to it are also applicable to corporate liabilities such as common stock, corporate bonds, and warrants. In particular, the formula can be used to derive the discount that should be applied to a corporate bond because of the possibility of default.
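The no-arbitrage argument in the abstract leads to the Black-Scholes formula for a European call; a minimal implementation for illustration, with arbitrary parameter values.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """European call price: spot S, strike K, maturity T in years,
    risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf  # standard normal CDF
    return S * N(d1) - K * exp(-r * T) * N(d2)

# At-the-money call, 1-year maturity, 20% volatility, 5% rate.
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
print(f"call price: {price:.4f}")  # roughly 10.45
```

The abstract's wider point follows directly: pricing a corporate bond with default risk amounts to valuing the bondholders' short put on the firm's assets with this same machinery.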

The classic model of the temporal variation of speculative prices (Bachelier 1900) assumes that successive changes of a price Z(t) are independent Gaussian random variables. But, even if Z(t) is replaced by log Z(t), this model is contradicted by the facts in at least four ways:

(1) Large price changes are much more frequent than predicted by the Gaussian; this reflects the "excessively peaked" ("leptokurtic") character of price relatives, which has been well established since at least 1915.

(2) Large, practically instantaneous price changes occur often, contrary to prediction, and it seems that they must be explained by causal rather than stochastic models.

(3) Successive price changes do not "look" independent, but rather exhibit a large number of recognizable patterns, which are, of course, the basis of the technical analysis of stocks.

(4) Price records do not look stationary, and statistical expressions such as the sample variance take very different values at different times; this nonstationarity seems to put a precise statistical model of price change out of the question.
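Point (1), leptokurtosis, is easy to demonstrate with simulated data: Student-t returns show far more large moves than Gaussian returns of comparable scale. All parameters below are illustrative.

```python
import numpy as np

# Compare excess kurtosis of Gaussian vs heavy-tailed (Student-t) samples.
rng = np.random.default_rng(3)
n = 200_000
gaussian = rng.standard_normal(n)
fat_tailed = rng.standard_t(df=6, size=n)   # t with 6 dof: fat tails

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0              # 0 for a Gaussian

print(excess_kurtosis(gaussian))    # near 0
print(excess_kurtosis(fat_tailed))  # well above 0: "leptokurtic"
```

Empirical stock returns behave like the second series, not the first, which is exactly the contradiction with the Bachelier model that the abstract emphasizes.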

The efficient market hypothesis has been widely tested and, with few exceptions, found consistent with the data in a wide variety of markets: the New York and American Stock Exchanges, the Australian, English, and German stock markets, various commodity futures markets, the Over-the-Counter markets, the corporate and government bond markets, the option market, and the market for seats on the New York Stock Exchange. Yet, in a manner remarkably similar to that described by Thomas Kuhn in his book, The Structure of Scientific Revolutions, we seem to be entering a stage where widely scattered and as yet incohesive evidence is arising which seems to be inconsistent with the theory. As better data become available (e.g., daily stock price data) and as our econometric sophistication increases, we are beginning to find inconsistencies that our cruder data and techniques missed in the past. It is evidence which we will not be able to ignore. The purpose of this special issue of the Journal of Financial Economics is to bring together a number of these scattered pieces of anomalous evidence regarding Market Efficiency. As Ball (1978) points out in his survey article: taken individually, many scattered pieces of evidence on the reaction of stock prices to earnings announcements which are inconsistent with the theory don't amount to much. Yet viewed as a whole, these pieces of evidence begin to stack up in a manner which makes a much stronger case for the necessity to carefully review both our acceptance of the efficient market theory and our methodological procedures.

The purpose of this paper is to investigate the structure of the class of market excess demand functions which can be generated by aggregating individual utility maximizing behavior. Among the results are: (i) in a region of the relative price domain an arbitrary polynomial function can be generated as an excess demand function for a particular commodity, and (ii) for any p in the relative price domain, a given configuration of excess demands and rates of change in excess demand can be generated at p if and only if it is consistent with Walras' Law.