commentary
Economics crisis
Thomas Lux and Frank Westerhoff
Economic theory failed to envisage even the possibility of a financial crisis like the present one. A new foundation is needed that takes into account the interplay between heterogeneous agents.
Once viewed as mystic monetary engineer, Alan Greenspan, former Chairman of the US Federal Reserve, has been re-cast as irresponsible villain, one who laid the ground for the present worldwide financial catastrophe. Asked whether his 'ideology' had pushed him to make decisions he now regrets, Greenspan confessed [1] that he would have deemed impossible the ongoing disruptions to the financial system and that “his belief in deregulation had been shaken”.
He is not the only one who has been taken entirely by surprise. Most economists did not in any way foresee the depth of the current crisis, or even consider it possible. Even those who warned [2] of over-exuberance in the US housing market did not have any clue about the impending meltdown, which, to the shocked public, looks as if some Dr Strangelove on Wall Street had pushed the button on a financial Doomsday Device. Greenspan's supposed ideology certainly coincided widely with that of mainstream economists who believe in the self-regulating forces of unrestrained financial markets, the 'efficiency' of asset-price formation, and the increased efficiency in risk allocation and sharing through the introduction of ever more financial instruments.
All of this is just the finance version of that textbook economic paradigm, 'homo economicus', who has unlimited insightfulness and capability of deliberation (economists typically speak of 'rationality'). This admirable person manages his financial affairs as a side-aspect of his utility maximization problem, taking into account all potential future happenings with the correct probabilities. As there is only one way to be perfectly rational, this agent is usually the lone actor in economic models — a representative Robinson Crusoe.
Of course, this Crusoe has often been derided as a straw-man illustration of the dominant paradigm, criticized by non-mainstream economists, unbelieving natural scientists and a similarly unbelieving public. Still, the straw man is alive, and was well — at least until the current financial crisis started to unfold.
Although the principles outlined above are still the basis of most contemporary scholarly activity in economics, there are other trends. These include innovative work in 'behavioural economics' and experimental work with human subjects — recognized in the award of the 2002 Nobel Prize to Daniel Kahneman and Vernon L. Smith — which have revealed a plethora of behavioural patterns that contradict the assumption of perfectly rational behaviour.
However, these developments still occupy only a marginal position. The widespread perception within our profession is that behavioural research delivers a curious set of anomalies or exceptions that lack coherence, and whose impact gets washed out in the aggregate. In contrast, the mainstream paradigm is seen as a more solid and consistent framework. Economic policy advice, particularly in financial economics, will therefore typically be based on a set of axioms and hypotheses derived ultimately from the Robinson Crusoe scenario. As the prevailing financial crisis cannot be explained using these standard tools, economic theory basically offers policy makers little guidance about what to do in the current situation.
A major problem is that, despite many refinements, this is not at all a system based on and confirmed by empirical research (as the naive believer in 'positive science' might expect). The vision (or ideology) encapsulated in the mainstream approach is of a more 'pre-analytical' nature and is supported mainly by elegant but idealistic models of the economy. Perfect rationality and optimizing behaviour are used so pervasively in economics education that their basic tenets are taken for granted as the principles ruling the real world, despite all of the anomalies and exceptions discovered in empirical research.
For instance, it would be hard to find supporting evidence for the firmly held belief that more derivative instruments — which should allow agents to insure themselves better against the stochastic wheels of fortune — lead to a better allocation of resources and thus an increase in market efficiency. This assertion is based entirely on the benefits of contingent claims in the textbook general-equilibrium model. Derived in the abstract, the efficiency gain through derivatives is only a hypothesis, yet this is not how economists are used to thinking of such theorems: it is the mathematical proof within the model economy that is considered its validation, rather than any empirical evidence.
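The textbook argument runs as follows (a standard two-period general-equilibrium statement, given here as background rather than taken from the commentary itself): with S possible future states of the world, an Arrow security for state s costs q_s today and pays one unit of consumption if and only if state s occurs. When all S securities are traded, an agent with state-contingent endowment (e_1, ..., e_S) faces the single budget constraint

\sum_{s=1}^{S} q_s c_s \le \sum_{s=1}^{S} q_s e_s ,

so any affordable state-contingent consumption plan is attainable. Markets are then 'complete', the first welfare theorem applies, and the competitive allocation is Pareto-efficient; every new derivative is read as a step towards this completeness. The proof is airtight inside the model, which is precisely the point: inside the model is the only place it has been validated.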
A glance at real-life operations in derivative markets easily shows why the theory fails: instead of hedging away risk, many market participants use derivatives in an 'anomalous' way, to build up speculative positions so as to profit from higher returns, as long as the downside risk does not materialize. The near disaster brought about in the late 1990s by the collapse of notorious hedge fund Long-Term Capital Management (intellectually based on modern derivative theory) should have raised some doubts. If that was not compelling enough, the present crisis should constitute its ultimate rejection.
The dominance of the rational-agent paradigm is intimately intertwined with an even more cumbersome conceptual reductionism. As there can only be one way to act fully rationally, everyone should display exactly the same behaviour. Therefore, a representative agent would be sufficient. Taking both aspects together, the typical format of current economic models is that of a single household or firm maximizing its utility or profit over a finite or infinite lifespan. Technically, this is a dynamic programming problem.
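In its canonical form this is, for example, the consumption-savings problem (a standard textbook formulation, reproduced here as our illustration rather than the authors' own): the representative household chooses a consumption stream {c_t} to maximize expected discounted utility

\max_{\{c_t\}} \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t u(c_t), \qquad 0 < \beta < 1,

subject to a budget constraint such as a_{t+1} = (1 + r)(a_t + y_t - c_t), where a_t is wealth and y_t income. The associated Bellman equation,

V(a) = \max_{c} \left[ u(c) + \beta \, \mathbb{E} \, V\bigl((1 + r)(a + y - c)\bigr) \right],

is solved for a single value function of a single agent; interaction between heterogeneous actors appears nowhere in the formalism.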
This methodological preference excludes the study of interaction among economic agents. However, most of what is relevant and interesting in economic life has to do with the interplay and connection between diverse economic actors. The current crisis is a perfect example of the importance of interactions at various levels. It was the interaction between highly connected international financial markets that generated the spillover from the US subprime-mortgage problem to other layers of the financial system.
Securitization of credit risks enabled lenders to sell various parts of their mortgage portfolios to other financial institutions, thus creating new links with these buyers as well as, indirectly, among them. Other new asset classes, such as credit default swaps, added new links between formerly unconnected entities. This gives us a glimpse of how financial innovations have increased the degree of connectivity within the financial system. It is well known that highly connected systems might be 'robust yet fragile' [3], but such important aspects have been out of reach of the mainstream approach to economics.
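The mechanism can be made concrete with a threshold-cascade simulation in the spirit of ref. [3] (a minimal sketch, assuming Python with the networkx package; the network size, degree and threshold are arbitrary illustrative values, not calibrated to any banking system): each institution fails once the failed share of its counterparties reaches its tolerance, starting from a single random failure.

import random
import networkx as nx

def cascade_fraction(n=1000, avg_degree=6, threshold=0.2, seed=0):
    """Fraction of nodes failed after one random initial failure."""
    rng = random.Random(seed)
    g = nx.erdos_renyi_graph(n, avg_degree / (n - 1), seed=seed)
    failed = {rng.randrange(n)}          # a single initial failure
    changed = True
    while changed:
        changed = False
        for node in g.nodes:
            if node in failed:
                continue
            nbrs = list(g.neighbors(node))
            # a node fails once the failed share of its counterparties
            # reaches its tolerance threshold
            if nbrs and sum(nb in failed for nb in nbrs) / len(nbrs) >= threshold:
                failed.add(node)
                changed = True
    return len(failed) / n

sizes = [cascade_fraction(seed=s) for s in range(100)]
print("typical cascade:", sorted(sizes)[50])   # usually negligible
print("largest cascade:", max(sizes))          # occasionally system-wide

Most runs end with a negligible cascade, but occasionally one initial failure sweeps almost the entire network. That bimodal outcome, rare but system-wide collapse, is exactly the kind of property that is invisible at the level of any single agent's optimization problem.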
In fact, the ubiquitous notion of 'systemic risk' signals that current events concern the financial system at an aggregate level. For natural scientists, the distinction between micro-level phenomena and those originating on a macro-, system-wide scale from the interaction of microscopic units is clear. The overall systemic features of the crisis would be seen as an emergent phenomenon of the dispersed micro-activity. To reduce these macro events to the outcome of the decision process of a single agent seems to be missing the point.
As with systemic risk, the notion of coordination failure (a term often used to characterize the endogenous nature of economic slumps and recessions) in itself encapsulates a perspective of the 'more is different' paradigm [4], an involuntary negative collective outcome of a system of dispersed activity. Rather than looking for the explanation in a particularly odd case of a microscopic dynamic programming problem, it would seem much more plausible to investigate the 'logic of collective activity' on the macroscopic scale. However, due to the conceptual reductionist philosophy, macroeconomics has been entirely reduced to microeconomic theory in the past few decades by insisting on representative rational agents. That the overall system is different from its parts is plainly incomprehensible from the viewpoint of the ruling school of thought.
Economics has thus, by its methodology, tied its own hands and prevented the analysis of vital aspects of economic systems. For example, despite the recent surge of research in network theory, the now apparent linkages between banks have received scant attention. In the few papers that have been published, the analyses are of a static nature, based on equilibrium concepts, and do not easily lend themselves to empirical applications. The even smaller number of studies using empirical data or realistic models come from authors with a background in physics [5]. Unfortunately, the study of anything at a systemic level has been defined away from economics by the insistence on micro-foundations that simply set the macro sphere equal to the microscopic base unit.
What could be the way out of this dilemma? In our view, a change of methodological orientation in economics is needed, to take into account the 'more is different' paradigm. On the one hand, economists need to take seriously the various deviations from 'rationality' revealed by behavioural research. On the other hand, to avoid getting lost in a patchwork of behavioural biases and anomalies, a new, empirically based type of micro-foundation is necessary — one that stresses the links between boundedly rational agents more than the agents' internal processes. It would, therefore, also not be enough to replace the current paradigm by a representative 'non-rational' actor (as has sometimes been done in recent literature).
The experience of the natural sciences in coping with complex systems would suggest a parsimonious stochastic approach. Because agents in large economic systems will display heterogeneity in terms of their different micro motives, degrees of deliberation and information-processing capabilities, one might hope that this variability of human behaviour can be quantified in a tractable way using statistical laws. Ongoing work inspired by statistical physics shows that relatively simple models with plausible behavioural rules have the potential to replicate key empirical regularities of financial markets [6]. In these models, direct and indirect local and global interactions between market participants are important ingredients in understanding the dynamics of financial markets. Currently, similarly simple stochastic models are being developed in the study of the distribution of income and wealth [7], and some economists have even taken this approach to macroeconomic models [8].
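To give a flavour of this line of work, the following is a deliberately stripped-down sketch in the spirit of the chartist-fundamentalist models surveyed in ref. [6] (our Python illustration, not the authors' published model; all coefficients are arbitrary). Fundamentalists bet on reversion towards a fixed fundamental value, chartists extrapolate the last price move, and agents drift stochastically between the two groups, so the shifting population mix modulates volatility.

import random

random.seed(1)
p = 0.0                  # log price; log fundamental value fixed at 0
w = 0.5                  # fraction of chartist (trend-following) traders
prev_ret = 0.0
returns = []
for t in range(20000):
    # chartists chase the last price move, fundamentalists push the
    # price back towards its fundamental value
    demand = w * 0.9 * prev_ret - (1.0 - w) * 0.2 * p
    # noise trading grows with the chartist fraction (destabilizing herding)
    ret = demand + random.gauss(0.0, 0.002 + 0.02 * w)
    p += ret
    returns.append(ret)
    prev_ret = ret
    # agents drift stochastically between the groups (bounded random walk)
    w = min(0.95, max(0.05, w + random.gauss(0.0, 0.03)))

mean = sum(returns) / len(returns)
var = sum((r - mean) ** 2 for r in returns) / len(returns)
kurt = sum((r - mean) ** 4 for r in returns) / len(returns) / var ** 2
print("kurtosis:", round(kurt, 1))   # above 3 signals fat tails relative to a Gaussian

Because the chartist fraction w is persistent, calm and turbulent phases alternate (volatility clustering) and the pooled returns come out fat-tailed, two of the key empirical regularities of financial markets that a fixed representative agent with Gaussian shocks cannot produce.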
The apparent systemic vulnerability of our globalized financial markets has brought to the fore another carelessly neglected facet of economic interactions. Most economic problems are emergent phenomena of complex societies that require a systemic perspective. A new micro-foundation based on interactions would be the missing macro counterpart to the microeconomic regularities revealed in behavioural economics. To develop a proper perspective on systemic phenomena, economics as a science should take stock of the experience of the natural sciences in handling complex systems with strong interactions. A partial reorientation in modelling principles and more methodological flexibility would enable us to tackle more directly those problems that seem to be most vital in our large, globalized economic systems.
Thomas Lux is in the Department of Economics, University of Kiel, Olshausenstraße 40, D-24118 Kiel, Germany, and is a member of the research group 'Risks in the Banking Sector' of the Kiel Institute for the World Economy.
e-mail: lux@bwl.uni-kiel.de

Frank Westerhoff is in the Department of Economics, University of Bamberg, Feldkirchenstraße 21, D-96045 Bamberg, Germany.
e-mail: frank.westerhoff@uni-bamberg.de
References
1. Andrews, E. L. Greenspan concedes error on regulation. The New York Times online (23 October 2008).
2. Shiller, R. Irrational Exuberance 2nd edn (Princeton Univ. Press, 2005).
3. Watts, D. J. Proc. Natl Acad. Sci. USA 99, 5766–5771 (2002).
4. Anderson, P. W. Science 177, 393–396 (1972).
5. Iori, G., Jafarey, S. & Padilla, F. J. Econ. Behav. Organ. 61, 525–542 (2006).
6. Lux, T. in Handbook on Financial Economics (eds Schenk-Hoppé, K. & Hens, T.) (Elsevier, in the press).
7. Chatterjee, A., Yarlagadda, S. & Chakrabarti, B. (eds) Econophysics of Wealth Distributions (Springer, 2005).
8. Aoki, M. & Yoshikawa, H. Reconstructing Macroeconomics: A Perspective from Statistical Physics and Combinatorial Stochastic Processes (Cambridge Univ. Press, 2007).
Acknowledgement
The authors are grateful to Mishael Milakovic for inspiring discussions.