A new family of penalty functions, adaptive to the likelihood, is introduced for model selection in general regression models. It arises naturally by assuming certain types of prior distributions on the regression parameters. To study stability properties of the penalized maximum likelihood estimator, two types of asymptotic stability are defined. Theoretical properties, including parameter-estimation consistency, model-selection consistency, and asymptotic stability, are established under suitable regularity conditions. An efficient coordinate-descent algorithm is proposed. Simulation results and real data analysis show that the proposed method performs competitively in comparison with existing ones.
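The likelihood-adaptive penalty itself is not reproduced in this abstract, so the following is only a minimal sketch of the coordinate-descent idea it mentions, applied to a generic L1-penalized least-squares objective with closed-form soft-thresholding updates; all names and settings are illustrative, not the paper's method.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the closed-form coordinate update for an L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def coordinate_descent_lasso(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r_j = y - X @ beta + X[:, j] * beta[j]
            z_j = X[:, j] @ r_j / n
            beta[j] = soft_threshold(z_j, lam) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ beta_true + 0.5 * rng.standard_normal(200)
print(coordinate_descent_lasso(X, y, lam=0.1))
```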
The US economy is arguably on an unsustainable trajectory. The main indicators of this are a large current account deficit, a large federal budget deficit, and steadily rising costs of Social Security and Medicare. In this chapter, we discuss these observations and the extent to which the financial and economic crisis may have changed the outlook. Before doing so, we need to define what we mean by sustainability. A commonly used definition of sustainability is that the intertemporal budget constraint is satisfied.
This paper aims to provide a practical example of the assessment and propagation of input uncertainty for option pricing when using tree-based methods. Input uncertainty is propagated into output uncertainty, reflecting that option prices are as unknown as the inputs they are based on. Option pricing formulas are tools whose validity is conditional not only on how closely the model represents reality, but also on the quality of the inputs they use, and those inputs are usually not observable. We provide three alternative frameworks for calibrating option pricing tree models, propagating parameter uncertainty into the resulting option prices. Finally, we compare our methods with classical calibration-based results under the assumption that there is no established options market. These methods can be used to price instruments for which there is no options market, and serve as a methodological tool to account for parameter and model uncertainty in theoretical option pricing.
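The paper's three calibration frameworks are not given in the abstract. As a generic illustration of propagating input uncertainty through a tree model, the sketch below prices a European call on a Cox–Ross–Rubinstein binomial tree and pushes an assumed sampling distribution for the volatility through the pricer; the distribution and all parameter values are hypothetical.

```python
import numpy as np

def crr_european_call(S0, K, r, sigma, T, n_steps=200):
    """European call price on a Cox-Ross-Rubinstein binomial tree."""
    dt = T / n_steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)        # risk-neutral up-probability
    j = np.arange(n_steps + 1)
    ST = S0 * u ** j * d ** (n_steps - j)     # terminal stock prices
    value = np.maximum(ST - K, 0.0)           # terminal payoffs
    disc = np.exp(-r * dt)
    for _ in range(n_steps):                  # backward induction
        value = disc * (q * value[1:] + (1 - q) * value[:-1])
    return value[0]

rng = np.random.default_rng(1)
# Hypothetical uncertainty about sigma; abs() guards against rare negative draws.
sigmas = np.abs(rng.normal(loc=0.20, scale=0.03, size=2000))
prices = np.array([crr_european_call(100, 100, 0.02, s, 1.0) for s in sigmas])
print(prices.mean(), np.percentile(prices, [2.5, 97.5]))  # price uncertainty band
```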
Analysis of multivariate time series is a common problem in areas like finance and economics. The classical tools for this purpose are vector autoregressive models. These, however, are limited to modeling linear and symmetric dependence. We propose a novel copula-based model that allows for non-linear and asymmetric modeling of serial as well as between-series dependencies. The model exploits the flexibility of vine copulas, which are built up from bivariate copulas only. We describe statistical inference techniques for the new model and demonstrate its usefulness in three relevant applications: we analyze time series of macroeconomic indicators, of electricity load demands, and of bond portfolio returns.
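Vine copulas are assembled from bivariate building blocks, so a small illustration of the asymmetric dependence a single bivariate copula can capture (and a Gaussian model cannot) may help. This sketch, not taken from the paper, samples a Clayton copula via a gamma frailty and checks its lower-tail dependence coefficient, which equals 2^(-1/theta).

```python
import numpy as np

def clayton_sample(theta, n, rng):
    """Sample a bivariate Clayton copula via a gamma frailty (Marshall-Olkin method)."""
    v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)   # frailty variable
    e = rng.exponential(size=(n, 2))
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)       # uniform margins

rng = np.random.default_rng(2)
u = clayton_sample(theta=2.0, n=100_000, rng=rng)
# Joint lower-tail exceedance ratio should approach 2^(-1/2) ~ 0.707 as q -> 0
for q in (0.05, 0.01):
    print(q, np.mean((u[:, 0] < q) & (u[:, 1] < q)) / q)
```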
The paper focuses on satisfaction with income and proposes a utility model built on two value systems: the `Ego' system, described as the assessment of one's own income relative to one's own past and future income, and the `Alter' system, described as the assessment of one's own income relative to a reference group. We show how the union of these two value systems and the use of relative deprivation measures can lead to a model able to accommodate a wide range of theories on income and happiness. The model is then tested using the Consortium of Household Panels for European Socio-economic Research (CHER), a collection of 19 panel surveys comprising over 1.2 million individual observations. We find absolute income to sit at the intersection of the `Ego' and the `Alter' systems and to play the most prominent role in explaining satisfaction with income. Relative deprivation is also found to be important for understanding the income-happiness nexus, while income expectations appear less relevant once we control for absolute income. Overall, the `Alter' system (the cross-sectional comparison with others) seems to be more relevant in valuing income than the `Ego' system (the longitudinal self-comparison of income).
For computing exact designs of experiments under multiple resource constraints, we developed a heuristic method related to the Detmax procedure. To illustrate the performance of the heuristic, we computed D-efficient designs for a block model with limits on the numbers of blocks, for a quadratic regression model with simultaneous marginal and cost constraints, and for a non-linear regression model with simultaneous direct and cost constraints. The numerical examples demonstrate that the proposed heuristic generates comparable or better results than competing algorithms, even in their specific domains of application.
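The Detmax-related heuristic and the resource constraints are not spelled out in the abstract. As a baseline illustration of exchange-type search for exact D-optimal designs, the sketch below greedily swaps design points on a candidate grid for the quadratic regression model mentioned above, without constraints; all settings are illustrative.

```python
import numpy as np

def log_det_info(F, idx):
    """Log-determinant of the information matrix for the design given by rows idx of F."""
    X = F[idx]
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

def exchange_d_optimal(F, n_points, n_restarts=20, rng=None):
    """Greedy point-exchange search for an exact D-optimal design on candidate set F."""
    rng = rng or np.random.default_rng()
    best_idx, best_val = None, -np.inf
    for _ in range(n_restarts):
        idx = list(rng.choice(len(F), size=n_points, replace=True))
        improved = True
        while improved:
            improved = False
            for pos in range(n_points):
                current = log_det_info(F, idx)
                for cand in range(len(F)):
                    trial = idx.copy()
                    trial[pos] = cand
                    if log_det_info(F, trial) > current + 1e-12:
                        idx, improved = trial, True
                        current = log_det_info(F, idx)
        val = log_det_info(F, idx)
        if val > best_val:
            best_idx, best_val = idx, val
    return sorted(best_idx), best_val

# Quadratic regression on [-1, 1]: regression vector f(x) = (1, x, x^2)
x = np.linspace(-1, 1, 41)
F = np.column_stack([np.ones_like(x), x, x ** 2])
print(exchange_d_optimal(F, n_points=6, rng=np.random.default_rng(3)))
```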
We propose a numerical method to evaluate the performance of the emerging Generalized Shiryaev–Roberts (GSR) change-point detection procedure in a "minimax-ish" multi-cyclic setup where the procedure of choice is applied repetitively (cyclically) and the change is assumed to take place at an unknown time moment in a distant-future stationary regime. Specifically, the proposed method is based on the integral-equations approach and uses the collocation technique with the basis functions chosen so as to exploit a certain change-of-measure identity and the GSR detection statistic's unique martingale property. As a result, the method's accuracy and robustness improve, and so does its efficiency, since the change-of-measure ploy allows the Average Run Length (ARL) to false alarm and the Stationary Average Detection Delay (STADD) to be computed simultaneously. We show that the method's rate of convergence is quadratic and supply a tight upper bound on its error. We conclude with a case study and confirm experimentally that the proposed method's accuracy and rate of convergence are robust with respect to three factors: (a) partition fineness (coarse vs. fine), (b) change magnitude (faint vs. strong), and (c) the level of the ARL to false alarm (low vs. high). Since the method is not restricted to a particular data distribution or to a specific value of the GSR detection statistic's headstart, this work may help gain greater insight into the characteristics of the GSR procedure and aid practitioners in designing the GSR procedure as needed while fully utilizing its potential.
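The integral-equation and collocation machinery is not reproduced here, but the GSR detection statistic itself follows the simple recursion R_n = (1 + R_{n-1}) * Lambda_n with headstart R_0 = r, where Lambda_n is the per-observation likelihood ratio. A minimal sketch for an assumed Gaussian mean shift, with illustrative parameters:

```python
import numpy as np
from scipy.stats import norm

def gsr_run(x, mu0, mu1, sigma, threshold, headstart=0.0):
    """Run the GSR recursion R_n = (1 + R_{n-1}) * LR_n on data x;
    return the first n with R_n >= threshold (or None if no alarm)."""
    r = headstart
    for n, xn in enumerate(x, start=1):
        lr = norm.pdf(xn, mu1, sigma) / norm.pdf(xn, mu0, sigma)
        r = (1.0 + r) * lr
        if r >= threshold:
            return n
    return None

rng = np.random.default_rng(4)
pre = rng.normal(0.0, 1.0, 100)    # pre-change regime: N(0, 1)
post = rng.normal(0.5, 1.0, 200)   # post-change regime: N(0.5, 1), change at n = 101
print(gsr_run(np.concatenate([pre, post]), mu0=0.0, mu1=0.5, sigma=1.0, threshold=500.0))
```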
For estimation and prediction of random fields, it is increasingly acknowledged that the kriging variance may be a poor representation of the true uncertainty. Experimental designs based on more elaborate criteria that are appropriate for empirical kriging are then often non-space-filling and very costly to determine. In this paper, we investigate the possibility of using a compound criterion inspired by an equivalence-theorem type relation to build designs that are quasi-optimal for the empirical kriging variance when space-filling designs become unsuitable. Two algorithms are proposed: one relies on stochastic optimization to explicitly identify the Pareto front, while the second uses the surrogate criterion as a local heuristic to choose the points at which the (costly) true empirical kriging variance is effectively computed. We illustrate the performance of the presented algorithms on both a simple simulated example and a real oceanographic dataset.
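The compound criterion and the empirical kriging variance are not detailed in the abstract. As a baseline, the sketch below computes the plain (simple) kriging variance for a 1-D Gaussian process and greedily adds the candidate point of largest variance, a much cruder rule than the surrogate-guided algorithms described above; the kernel and settings are illustrative.

```python
import numpy as np

def gauss_kernel(a, b, length=0.3):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def kriging_variance(design, candidates, noise=1e-8):
    """Simple-kriging predictive variance sigma^2(x) = k(x,x) - k(x,X) K^{-1} k(X,x)."""
    K = gauss_kernel(design, design) + noise * np.eye(len(design))
    k = gauss_kernel(design, candidates)
    return 1.0 - np.einsum('ij,ij->j', k, np.linalg.solve(K, k))

# Greedy design: repeatedly add the candidate with the largest kriging variance
cand = np.linspace(0, 1, 201)
design = np.array([0.5])
for _ in range(5):
    v = kriging_variance(design, cand)
    design = np.append(design, cand[np.argmax(v)])
print(np.sort(design))
```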
The purpose of this article is to present algorithms for a change detection model of the political business cycle. The political business cycle is interesting in the context of the current political situation in Europe, i.e., the progressive integration of the European Union countries and the wave of financial problems that has affected states so far regarded as economically stable. Monitoring of this phenomenon is characterized by the fact that we usually do not have full information about the behavior of business indexes before and after the change. It is assumed that we observe a stochastic sequence whose mathematical model predicts a sudden change. The process is Markovian when the change moment is given. The initial problem of disorder detection is transformed into the optimal stopping of the observed sequence. In order to construct an algorithm for estimating the moment of change, we transform the task into an equivalent problem of optimal stopping based on the observed sequence and some statistics. The analysis resulting from this transformation is the source of the change-point estimation algorithms. The formula for the optimal decision functions is derived.
In this paper we consider a class of conditionally Gaussian state space models and discuss how they can provide a flexible and fairly simple tool for modelling financial time series, even in the presence of different components in the series or of stochastic volatility. Estimates can be computed by recursive equations, which provide the optimal solution under rather mild assumptions. In more general models, the filter equations can still provide approximate solutions. We also discuss how some models traditionally employed for analysing financial time series can be regarded within the state-space framework. Finally, we illustrate the models in two examples with real data sets.
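In the conditionally Gaussian case, the recursive equations referred to are Kalman-filter updates. A minimal sketch for the local level model y_t = mu_t + eps_t, mu_{t+1} = mu_t + eta_t, with illustrative variances (the paper's model class is more general):

```python
import numpy as np

def kalman_filter(y, a1, p1, sigma_eps2, sigma_eta2):
    """Kalman filter for the local level model; returns filtered state estimates."""
    a, p = a1, p1                          # prior mean and variance of mu_1
    filtered = []
    for yt in y:
        f = p + sigma_eps2                 # one-step prediction variance of y_t
        k = p / f                          # Kalman gain
        a = a + k * (yt - a)               # update state mean given y_t
        p = p * (1 - k)                    # update state variance
        filtered.append(a)
        p = p + sigma_eta2                 # predict variance of mu_{t+1}
    return np.array(filtered)

rng = np.random.default_rng(5)
mu = np.cumsum(rng.normal(0, 0.1, 300))    # latent level (random walk)
y = mu + rng.normal(0, 0.5, 300)           # noisy observations
print(kalman_filter(y, a1=0.0, p1=10.0, sigma_eps2=0.25, sigma_eta2=0.01)[-5:])
```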
This paper addresses the problem of estimating the tail index α of distributions with heavy, Pareto-type tails for dependent data, which is of interest in the areas of finance, insurance, environmental monitoring, and teletraffic analysis. A novel approach based on the max self-similarity scaling behavior of block maxima is introduced. The method exploits the increasing lack of dependence of maxima over blocks of large size, which proves useful for time series data.
We establish the consistency and asymptotic normality of the proposed max-spectrum estimator for a large class of m-dependent time series, in the regime of intermediate block maxima. In the regime of large block maxima, we demonstrate the distributional consistency of the estimator for a broad range of time series models including linear processes. The max-spectrum estimator is a robust and computationally efficient tool, which provides a novel time-scale perspective on the estimation of tail exponents. Its performance is illustrated on synthetic and real data sets.
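The core scaling idea can be sketched directly: for Pareto-type tails with index α, the expected log2 block maximum grows roughly linearly in the dyadic scale j with slope 1/α, so regressing mean log2 block maxima on j recovers α. A minimal version using simple OLS over an illustrative scale range, without the estimator's weighting or bias corrections:

```python
import numpy as np

def max_spectrum_tail_index(x, j_range):
    """Estimate alpha from E[log2 max over blocks of size 2^j] ~ j / alpha + const."""
    x = np.abs(np.asarray(x))
    yj = []
    for j in j_range:
        m = 2 ** j
        n_blocks = len(x) // m
        bmax = x[: n_blocks * m].reshape(n_blocks, m).max(axis=1)
        yj.append(np.mean(np.log2(bmax)))
    slope = np.polyfit(np.asarray(j_range), yj, 1)[0]
    return 1.0 / slope

rng = np.random.default_rng(6)
x = rng.pareto(1.5, 2 ** 16)   # i.i.d. Pareto-type data with tail index alpha = 1.5
print(max_spectrum_tail_index(x, j_range=np.arange(4, 11)))
```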
The paper presents new characterizations of the integer-valued moving average model. For four model variants we give moments and probability generating functions. Yule–Walker and conditional least squares estimators are obtained and studied by Monte Carlo simulation. A new generalized method of moments estimator based on probability generating functions is presented and shown to be consistent and asymptotically normal. Its small-sample performance is in some instances better than that of alternative estimators. The techniques are illustrated on a time series of traded stocks.
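As an illustration of one such variant, take the Poisson INMA(1) model X_t = β ∘ ε_{t-1} + ε_t, where ∘ denotes binomial thinning and ε_t ~ Poisson(λ); then ρ(1) = β/(1+β) and E[X] = λ(1+β), which yield simple Yule–Walker-type moment estimators. A sketch with illustrative parameters (the paper's four variants and GMM estimator are not reproduced):

```python
import numpy as np

def simulate_inma1(beta, lam, n, rng):
    """INMA(1): X_t = beta ∘ eps_{t-1} + eps_t with Poisson(lam) innovations."""
    eps = rng.poisson(lam, n + 1)
    thinned = rng.binomial(eps[:-1], beta)   # binomial thinning of eps_{t-1}
    return thinned + eps[1:]

def yule_walker_inma1(x):
    """Moment estimators from rho(1) = beta/(1+beta) and E[X] = lam*(1+beta)."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    rho1 = (xc[1:] @ xc[:-1]) / (xc @ xc)
    beta = rho1 / (1.0 - rho1)
    lam = x.mean() / (1.0 + beta)
    return beta, lam

rng = np.random.default_rng(7)
x = simulate_inma1(beta=0.6, lam=2.0, n=50_000, rng=rng)
print(yule_walker_inma1(x))   # should be close to (0.6, 2.0)
```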
This paper contributes empirically to our understanding of informed traders. It analyzes traders' characteristics in a foreign exchange electronic limit order market via anonymous trader identities. We use six indicators of informed trading in a cross-sectional multivariate approach to identify traders with high price impact. More information is conveyed by the trades of traders who simultaneously use medium-sized orders (practice stealth trading), have large trading volume, are located in a financial center, trade early in the trading session, trade at times of wide spreads, and trade when the order book is thin.
This work extends the study of hedging problems in markets with asymmetric information: an agent is supposed to possess additional information on market prices, unknown to the common investor.
The financial hedging problem for the influential and informed trader is modeled by a forward–backward stochastic differential equation, to be solved under an initial enlargement of the Brownian filtration. An existence and uniqueness theorem is proved under standard assumptions. The financial interpretation is derived in terms of the investment strategy of the informed and influential agent, and conclusions are drawn concerning the general influenced market, in terms of market completeness. An example of such an influenced and informed model is provided.
To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. There are many modeling issues that should be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference methods that allow expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.
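As a minimal illustration of the Loss Distribution Approach itself (without the Bayesian, dependence, or insurance extensions discussed above), the sketch below simulates an annual aggregate loss as a Poisson-frequency, lognormal-severity compound sum and reads off the 99.9% quantile used for the capital charge; all parameters are hypothetical.

```python
import numpy as np

def annual_loss_var(lam, mu, sigma, n_sims, quantile, rng):
    """Monte Carlo LDA: Poisson(lam) loss counts, lognormal(mu, sigma) severities;
    returns the requested quantile of the annual aggregate loss."""
    counts = rng.poisson(lam, n_sims)
    total = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])
    return np.quantile(total, quantile)

rng = np.random.default_rng(8)
# Hypothetical frequency/severity parameters; 0.999 is the Basel II confidence level.
print(annual_loss_var(lam=25, mu=9.0, sigma=2.0, n_sims=50_000, quantile=0.999, rng=rng))
```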
With survival data there is often interest not only in the survival time distribution but also in the residual survival time distribution. In fact, regression models to explain residual survival time might be desired. Building upon recent work of Kottas & Gelfand (J. Amer. Statist. Assoc. 96 (2001) 1458), we formulate a semiparametric median residual life regression model induced by a semiparametric accelerated failure time regression model. We utilize a Bayesian approach which allows full and exact inference. Classical work essentially ignores covariates and is either based upon parametric assumptions or is limited to asymptotic inference in non-parametric settings. No regression modelling of median residual life appears to exist. The Bayesian modelling is developed through Dirichlet process mixing. The models are fitted using Gibbs sampling. Residual life inference is implemented by extending the approach of Gelfand & Kottas (J. Comput. Graph. Statist. 11 (2002) 289). Finally, we present a fairly detailed analysis of a set of survival times, with moderate censoring, for patients with small cell lung cancer.
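The Bayesian semiparametric machinery is beyond a short sketch, but the target quantity is easy to state: the median residual life at time t is the median of T - t given T > t. An empirical version, ignoring censoring for simplicity (the paper's methods handle moderate censoring properly), with illustrative data:

```python
import numpy as np

def empirical_median_residual_life(times, t_grid):
    """Median of T - t among subjects still at risk at t (no censoring adjustment)."""
    times = np.asarray(times, dtype=float)
    out = []
    for t in t_grid:
        at_risk = times[times > t]
        out.append(np.median(at_risk - t) if at_risk.size else np.nan)
    return np.array(out)

rng = np.random.default_rng(10)
T = rng.weibull(1.5, 5_000) * 10.0   # illustrative survival times
print(empirical_median_residual_life(T, t_grid=[0.0, 5.0, 10.0]))
```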
We consider some inference problems concerning the drift parameters of the multi-factor Vasicek model (or multivariate Ornstein–Uhlenbeck process). For example, in modeling interest rates, the Vasicek model asserts that the term structure of interest ...
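As an illustration of drift-parameter inference in the one-factor case, dr_t = kappa*(theta - r_t) dt + sigma dW_t, the sketch below simulates the exact AR(1) discretization and recovers kappa and theta by least squares; parameters are illustrative, and the paper treats the multi-factor setting.

```python
import numpy as np

def simulate_vasicek(kappa, theta, sigma, r0, dt, n, rng):
    """Exact-discretization simulation of dr = kappa*(theta - r) dt + sigma dW."""
    a = np.exp(-kappa * dt)
    sd = sigma * np.sqrt((1 - a ** 2) / (2 * kappa))
    r = np.empty(n)
    r[0] = r0
    for t in range(1, n):
        r[t] = theta + (r[t - 1] - theta) * a + sd * rng.standard_normal()
    return r

def estimate_drift(r, dt):
    """Drift estimates from the AR(1) form r_{t+1} = theta*(1-a) + a*r_t + noise."""
    a, b = np.polyfit(r[:-1], r[1:], 1)   # slope a = exp(-kappa*dt), intercept b
    kappa = -np.log(a) / dt
    theta = b / (1 - a)
    return kappa, theta

rng = np.random.default_rng(9)
r = simulate_vasicek(kappa=0.8, theta=0.05, sigma=0.02, r0=0.03, dt=1 / 252, n=100_000, rng=rng)
print(estimate_drift(r, dt=1 / 252))   # should be close to (0.8, 0.05)
```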