# Applied Stochastic Models in Business and Industry

Published by Wiley

Online ISSN: 1526-4025



Article

A new family of penalty functions, adaptive to likelihood, is introduced for
model selection in general regression models. It arises naturally through
assuming certain types of prior distribution on the regression parameters. To
study stability properties of the penalized maximum likelihood estimator, two
types of asymptotic stability are defined. Theoretical properties, including
the parameter estimation consistency, model selection consistency, and
asymptotic stability, are established under suitable regularity conditions. An
efficient coordinate-descent algorithm is proposed. Simulation results and real
data analysis show that the proposed method has competitive performance in
comparison with existing ones.
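The abstract's adaptive, likelihood-dependent penalty family is not specified here; as a minimal, hypothetical illustration of the coordinate-descent idea, the sketch below runs cyclic coordinate descent for a plain lasso penalty (soft-thresholding updates) on toy data.

```python
import random

def soft_threshold(z, t):
    # Coordinate-wise minimizer under an L1 penalty.
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residuals with coordinate j excluded.
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n)) / n
            dj = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(zj, lam) / dj
    return b

random.seed(0)
n = 200
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(n)]
y = [2.0 * row[0] + random.gauss(0, 0.1) for row in X]  # only the first predictor matters
b = lasso_cd(X, y, lam=0.1)
```

The L1 update shrinks the active coefficient slightly below its true value and pushes the inactive one toward zero; an adaptive penalty would replace the fixed soft-threshold with a penalty-specific univariate update.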

…

Article

The US economy is arguably following an unsustainable trajectory. The main indicators of this are a large current account deficit, a large federal budget deficit, and trend-wise increasing costs of Social Security and Medicare. In this chapter, we discuss these observations and the extent to which the financial and economic crisis may have changed the outlook. Before doing so, we need to define what we mean by sustainability. A commonly used definition of sustainability is that the inter-temporal budget constraint is satisfied.

…

Article

This paper aims to provide a practical example of the assessment and propagation of input uncertainty for option pricing when using tree-based methods. Input uncertainty is propagated into output uncertainty, reflecting that option prices are as unknown as the inputs they are based on. Option pricing formulas are tools whose validity is conditional not only on how closely the model represents reality, but also on the quality of the inputs they use, and those inputs are usually not observable. We provide three alternative frameworks to calibrate option pricing tree models, propagating parameter uncertainty into the resulting option prices. We finally compare our methods with classical calibration-based results, assuming that no established options market exists. These methods can be applied to the pricing of instruments for which there is no options market, and can also serve as a methodological tool to account for parameter and model uncertainty in theoretical option pricing.
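As a sketch of the propagation step (the CRR tree and the Gaussian noise on volatility are illustrative assumptions, not the paper's calibration frameworks), one can push draws of an uncertain volatility through a standard binomial tree and read off the induced spread of option prices:

```python
import math, random

def crr_call(S0, K, r, sigma, T, steps=100):
    """European call via a Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    disc = math.exp(-r * dt)
    q = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    # Terminal payoffs, then backward induction.
    values = [max(S0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    for _ in range(steps):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Propagate input uncertainty: sample sigma from a (hypothetical)
# distribution and report the induced distribution of prices.
random.seed(1)
prices = [crr_call(100, 100, 0.05, max(random.gauss(0.2, 0.02), 0.05), 1.0)
          for _ in range(200)]
mean_price = sum(prices) / len(prices)
```

The spread of `prices` is the output uncertainty induced by the uncertain volatility; with a point estimate sigma = 0.2 the tree price is close to the Black-Scholes value of about 10.45.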

…

Article

Analysis of multivariate time series is a common problem in areas such as finance and economics. The classical tools for this purpose are vector autoregressive models. These, however, are limited to modeling linear and symmetric dependence. We propose a novel copula-based model that allows for non-linear and asymmetric modeling of serial as well as between-series dependencies. The model exploits the flexibility of vine copulas, which are built up from bivariate copulas only. We describe statistical inference techniques for the new model and demonstrate its usefulness in three relevant applications: we analyze time series of macroeconomic indicators, of electricity load demands and of bond portfolio returns.
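A vine decomposes a joint distribution into bivariate copulas; as a minimal, hypothetical building block (the full vine machinery of the paper is not reproduced), the sketch below simulates a first-order copula Markov chain in which consecutive observations are coupled by a bivariate Gaussian copula:

```python
import math, random

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def gaussian_copula_chain(rho, T, seed=9):
    """First-order copula Markov chain: consecutive uniform observations
    are linked by a bivariate Gaussian copula with parameter rho --
    the simplest serial-dependence building block of a vine."""
    random.seed(seed)
    z = random.gauss(0, 1)
    us = [norm_cdf(z)]
    for _ in range(T - 1):
        # AR(1) latent Gaussian keeps N(0,1) marginals, hence U(0,1) after CDF.
        z = rho * z + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
        us.append(norm_cdf(z))
    return us

us = gaussian_copula_chain(0.7, 5000)
mean_u = sum(us) / len(us)   # uniform marginals: mean near 0.5
```

Replacing the Gaussian pair copula with an asymmetric family (e.g. Clayton or Gumbel) is what gives vine-based models their non-linear, asymmetric serial dependence.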

…

Article

This paper provides new results for the range inter-events process of a birth–death random walk. Motivations for determining and using the inter-range event distribution have two sources. First, the analytical results we obtain are simpler than those for the range process and therefore make it easier to use statistics based on the inter-range event process. Further, most of the results for the range process are based on long-run statistical properties, which limits their practical usefulness, while inter-range events are by their nature 'short-term' statistics. Second, in many cases, data on amplitude change are easier to obtain and calculate than range and standard deviation processes. As a result, the predicted statistical properties of the inter-range event process can provide an analytical foundation for the development of statistical tests that may be used practically. Application to outlier detection, volatility and time-series analysis is discussed. Copyright © 2001 John Wiley & Sons, Ltd.
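For intuition, a range event occurs whenever the walk's running amplitude (maximum minus minimum so far) grows; the sketch below simulates a simple ±1 birth-death walk with illustrative parameters and records the inter-event waiting times:

```python
import random

def inter_range_events(steps, p=0.5, seed=7):
    """Simulate a ±1 birth-death random walk and record the waiting times
    between successive increases of its range (running max minus running min)."""
    random.seed(seed)
    x = 0
    lo = hi = 0          # running min and max
    last = 0             # time of the last range event
    waits = []
    for n in range(1, steps + 1):
        x += 1 if random.random() < p else -1
        if x > hi or x < lo:
            hi = max(hi, x)
            lo = min(lo, x)
            waits.append(n - last)   # inter-range event time
            last = n
    return waits

waits = inter_range_events(10000)
```

The very first step always extends the range, so the first waiting time is 1; later waits lengthen as new extremes become harder to reach.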

…

Article

The paper focuses on satisfaction with income and proposes a utility model built on two value systems: the 'Ego' system, in which one's own income is assessed relative to one's own past and future income, and the 'Alter' system, in which it is assessed relative to a reference group. We show how the union of these two value systems and the use of relative deprivation measures can lead to a model able to accommodate a wide range of theories on income and happiness. The model is then tested using the Consortium of Household Panels for European Socio-economic Research (CHER), a collection of 19 panel surveys comprising over 1.2 million individual observations. We find absolute income to sit at the intersection of the 'Ego' and 'Alter' systems and to play the most prominent role in explaining satisfaction with income. Relative deprivation is also found to be important for understanding the income-happiness nexus, while income expectations prove less relevant once we control for absolute income. Overall, the 'Alter' system (the cross-sectional comparison with others) seems to be more relevant in valuing income than the 'Ego' system (the longitudinal self-comparison of income).

…

Article

For computing exact designs of experiments under multiple resource
constraints, we developed a heuristic method related to the Detmax procedure.
To illustrate the performance of the heuristic, we computed D-efficient designs
for a block model with limits on the numbers of blocks, for a quadratic
regression model with simultaneous marginal and cost constraints, and for a
non-linear regression model with simultaneous direct and cost constraints. The
numerical examples demonstrate that the proposed heuristic produces results comparable to or better than those of competing algorithms, even in their specific domains of application.

…

Article

We propose a numerical method to evaluate the performance of the emerging
Generalized Shiryaev--Roberts (GSR) change-point detection procedure in a
"minimax-ish" multi-cyclic setup where the procedure of choice is applied
repetitively (cyclically) and the change is assumed to take place at an unknown
time moment in a distant-future stationary regime. Specifically, the proposed
method is based on the integral-equations approach and uses the collocation
technique with the basis functions chosen so as to exploit a certain
change-of-measure identity and the GSR detection statistic's unique martingale
property. As a result, the method's accuracy and robustness improve, as does
its efficiency since using the change-of-measure ploy the Average Run Length
(ARL) to false alarm and the Stationary Average Detection Delay (STADD) are
computed simultaneously. We show that the method's rate of convergence is
quadratic, and we supply a tight upper bound on its error. We conclude with a case study and confirm experimentally that the proposed method's accuracy and rate of convergence are robust with respect to three factors: (a) partition fineness (coarse vs. fine), (b) change magnitude (faint vs. contrast), and (c) the level of the ARL to false alarm (low vs. high). Since the method is not restricted to a particular data distribution or to a specific value of the GSR detection statistic's headstart, this work may help gain greater insight into the characteristics of the GSR procedure and aid practitioners in designing the GSR procedure as needed while fully utilizing its potential.
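For orientation, the GSR statistic evolves by the recursion R_n = (1 + R_{n-1})·Λ_n, where Λ_n is the per-observation likelihood ratio. The sketch below simulates a single run length to false alarm under a Gaussian mean-shift model; the specific means and threshold are illustrative assumptions, and the paper's method is analytic (integral equations), not Monte Carlo:

```python
import math, random

def gsr_false_alarm_time(mu1, threshold, headstart=0.0, seed=3, max_n=10**6):
    """Run the (Generalized) Shiryaev-Roberts recursion
    R_n = (1 + R_{n-1}) * LR_n on pre-change N(0,1) data, where LR_n is
    the likelihood ratio of N(mu1,1) vs N(0,1); return the first n with
    R_n >= threshold, i.e. one run length to false alarm."""
    random.seed(seed)
    R, n = headstart, 0
    while R < threshold and n < max_n:
        n += 1
        x = random.gauss(0.0, 1.0)                       # pre-change observation
        R = (1.0 + R) * math.exp(mu1 * x - mu1 * mu1 / 2.0)
    return n

rl = gsr_false_alarm_time(mu1=1.0, threshold=100.0)
```

Under the pre-change measure, R_n minus n (minus the headstart) is a martingale, which is the property the paper's change-of-measure collocation scheme exploits.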

…

Article

For estimation and prediction of random fields, it is increasingly
acknowledged that the kriging variance may be a poor representative of true
uncertainty. Experimental designs based on more elaborate criteria that are
appropriate for empirical kriging are then often non-space-filling and very
costly to determine. In this paper, we investigate the possibility of using a
compound criterion inspired by an equivalence theorem type relation to build
designs quasi-optimal for the empirical kriging variance, when space-filling
designs become unsuitable. Two algorithms are proposed: one relies on stochastic optimization to explicitly identify the Pareto front, while the second uses the surrogate criterion as a local heuristic to choose the points at which the (costly) true empirical kriging variance is effectively computed. We illustrate the performance of the presented algorithms on both a simple simulated example and a real oceanographic dataset.

…

Article

The purpose of this article is to present algorithms of the change detection model for the political business cycle. The political business cycle is interesting in the context of the current political situation in Europe, i.e., the progressive integration of the European Union countries and the wave of financial problems affecting states that had so far been regarded as economically stable. Monitoring of this phenomenon is characterized by the fact that we do not usually have full information about the behavior of business indexes before and after the change. It is assumed that we observe a stochastic sequence whose mathematical model predicts a sudden change. The process is Markovian when the change moment is given. The initial problem of disorder detection is transformed into an optimal stopping problem for the observed sequence. To construct an algorithm for estimating the moment of change, we transform the task into an equivalent problem of optimal stopping based on the observed magnitude and some statistics. The analysis obtained from this transformation is the source of the change-point estimation algorithms. The formula for the optimal decision functions is derived.

…

Article

In this paper we consider a class of conditionally Gaussian state space models and discuss how they can provide a flexible and fairly simple tool for modelling financial time series, even in the presence of different components in the series or of stochastic volatility. Estimates can be computed by recursive equations, which provide the optimal solution under rather mild assumptions. In more general models, the filter equations can still provide approximate solutions. We also discuss how some models traditionally employed for analysing financial time series can be regarded in the state space framework. Finally, we illustrate the models in two examples with real data sets.

…

Article

This paper addresses the problem of estimating the tail index α of distributions with heavy, Pareto-type tails from dependent data, a problem of interest in the areas of finance, insurance, environmental monitoring and teletraffic analysis. A novel approach based on the max self-similarity scaling behavior of block maxima is introduced. The method exploits the increasing lack of dependence of maxima over large blocks, which proves useful for time series data.
We establish the consistency and asymptotic normality of the proposed max-spectrum estimator for a large class of m-dependent time series in the regime of intermediate block maxima. In the regime of large block maxima, we demonstrate the distributional consistency of the estimator for a broad range of time series models including linear processes. The max-spectrum estimator is a robust and computationally efficient tool, which provides a novel time-scale perspective on the estimation of tail exponents. Its performance is illustrated on synthetic and real data sets.
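A rough Monte Carlo illustration of the max self-similarity idea (a simplified, unweighted version; the paper's max-spectrum estimator and its tuning are not reproduced): for Pareto-type data, the mean of log2 block maxima grows linearly in the dyadic scale j with slope 1/α, so a regression over scales recovers the tail index.

```python
import math, random

def max_spectrum_alpha(data, j_lo=4, j_hi=9):
    """Estimate the tail index alpha from the max self-similarity of
    dyadic block maxima: E[log2(max over 2^j obs)] grows like j/alpha."""
    js, ys = [], []
    for j in range(j_lo, j_hi + 1):
        m = 2 ** j
        maxima = [max(data[i:i + m]) for i in range(0, len(data) - m + 1, m)]
        js.append(j)
        ys.append(sum(math.log2(v) for v in maxima) / len(maxima))
    # Ordinary least-squares slope of mean log2-maxima on scale j.
    jbar = sum(js) / len(js)
    ybar = sum(ys) / len(ys)
    slope = (sum((j - jbar) * (y - ybar) for j, y in zip(js, ys))
             / sum((j - jbar) ** 2 for j in js))
    return 1.0 / slope

random.seed(2)
# Exact Pareto(alpha=2) sample via inverse transform: X = U^(-1/2).
data = [random.random() ** -0.5 for _ in range(2 ** 14)]
alpha_hat = max_spectrum_alpha(data)
```

The estimate should land near the true alpha = 2; the choice of scale range (j_lo, j_hi) plays the role of the intermediate-block-maxima regime in the abstract.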

…

Article

The paper presents new characterizations of the integer-valued moving average model. For four model variants we give moments and probability generating functions. Yule-Walker and conditional least squares estimators are obtained and studied by Monte Carlo simulation. A new generalized method of moments estimator based on probability generating functions is presented and shown to be consistent and asymptotically normal. Its small-sample performance is in some instances better than that of alternative estimators. The techniques are illustrated on a time series of traded stocks.
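As a minimal illustration of the model class (an INMA(1) with binomial thinning and Poisson innovations; the variant and parameters are illustrative), the sketch below simulates the process and checks its theoretical mean λ(1 + β):

```python
import math, random

def thin(beta, n):
    """Binomial thinning: beta ∘ n is a sum of n Bernoulli(beta) trials."""
    return sum(1 for _ in range(n) if random.random() < beta)

def simulate_inma1(beta, lam, T, seed=5):
    """Integer-valued MA(1): X_t = beta ∘ eps_{t-1} + eps_t,
    with i.i.d. Poisson(lam) innovations eps_t."""
    random.seed(seed)
    def pois(l):
        # Knuth's method; fine for small lam.
        L, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= random.random()
            if p < L:
                return k
            k += 1
    eps_prev = pois(lam)
    xs = []
    for _ in range(T):
        eps = pois(lam)
        xs.append(thin(beta, eps_prev) + eps)
        eps_prev = eps
    return xs

xs = simulate_inma1(beta=0.5, lam=2.0, T=20000)
mean_x = sum(xs) / len(xs)   # theory: lam * (1 + beta) = 3.0
```

Matching sample moments such as this mean and the lag-1 autocorrelation β/(1 + β) to their theoretical forms is the essence of the Yule-Walker estimators studied in the paper.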

…

Article

This paper contributes empirically to our understanding of informed traders. It analyzes traders' characteristics in a foreign exchange electronic limit order market via anonymous trader identities. We use six indicators of informed trading in a cross-sectional multivariate approach to identify traders with high price impact. More information is conveyed by the trades of those traders who simultaneously use medium-sized orders (practice stealth trading), have large trading volume, are located in a financial center, trade early in the trading session, trade at times of wide spreads, and trade when the order book is thin.

…

Article

This work extends the study of hedging problems in markets with asymmetric information: an agent is assumed to possess additional information on market prices, unknown to the common investor.
The financial hedging problem for the influential and informed trader is modeled by a forward–backward stochastic differential equation, to be solved under an initial enlargement of the Brownian filtration. An existence and uniqueness theorem is proved under standard assumptions. The financial interpretation is derived in terms of the investment strategy of the informed and influential agent, together with conclusions concerning the influenced market in general, in terms of market completeness. An example of such an influenced and informed model is provided.

…

Article

To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modeling issues must be resolved to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.

…

Article

With survival data there is often interest not only in the survival time distribution but also in the residual survival time distribution. In fact, regression models to explain residual survival time might be desired. Building upon recent work of Kottas & Gelfand ["J. Amer. Statist. Assoc." 96 (2001) 1458], we formulate a semiparametric median residual life regression model induced by a semiparametric accelerated failure time regression model. We utilize a Bayesian approach which allows full and exact inference. Classical work essentially ignores covariates and is either based upon parametric assumptions or is limited to asymptotic inference in non-parametric settings. No regression modelling of median residual life appears to exist. The Bayesian modelling is developed through Dirichlet process mixing. The models are fitted using Gibbs sampling. Residual life inference is implemented extending the approach of Gelfand & Kottas ["J. Comput. Graph. Statist." 11 (2002) 289]. Finally, we present a fairly detailed analysis of a set of survival times with moderate censoring for patients with small cell lung cancer. Copyright 2003 Board of the Foundation of the Scandinavian Journal of Statistics.

…

Article

We consider some inference problems concerning the drift parameters of multi-factor Vasicek models (or multivariate Ornstein–Uhlenbeck processes). For example, in modeling interest rates, the Vasicek model asserts that the term structure of interest ...

…

Article

In contrast to traditional regression analysis, latent variable modelling (LVM) can explicitly differentiate between measurement errors and other random disturbances in the specification and estimation of econometric models. This paper argues that LVM could be a promising approach for testing economic theories, because applied research in business and economics is based on statistical information that is frequently measured inaccurately. Considering the theory of industry-price determination, where the price variables involved are known to include large measurement errors, a latent variable structural-equations model is constructed and applied to data on 7381 product categories classified into 295 manufacturing industries of the US economy. The estimates obtained, compared and evaluated against a traditional regression model fitted to the same data, show the advantages of the LVM analytical framework, which could bring a long-drawn-out conflict between empirical results and theory to a satisfactory reconciliation. Copyright © 2003 John Wiley & Sons, Ltd.

…

Article

Until now, data mining statistical techniques have not been used to improve the prediction of abnormal stock returns using insider trading data. Consequently, an investigation using neural network analysis was initiated. The research covered 343 companies for a period of 4½ years. Study findings revealed that the prediction of abnormal returns could be enhanced in the following ways: (1) extending the time of the future forecast up to 1 year; (2) increasing the period of back aggregated data; (3) narrowing the assessment to certain industries such as electronic equipment and business services and (4) focusing on small and midsize rather than large companies. Copyright © 2002 John Wiley & Sons, Ltd.

…

Article

This paper studies the dividend payments in a compound Poisson surplus process with debit interest. Dividends are paid to the shareholders according to a barrier strategy. An alternative assumption is that business can go on after ruin, as long as it is profitable. When the surplus is negative, a debit interest is applied. First, we obtain the integro-differential equations satisfied by the moment-generating function and moments of the discounted dividend payments, and we also prove their continuity at zero. Then, applying these results, we obtain explicit expressions for the moment-generating function and moments of the discounted dividend payments for exponential claims. Furthermore, we discuss the optimal dividend barrier when the claim sizes have a common exponential distribution. Finally, we give numerical examples for exponential claims and Erlang(2) claims. Copyright © 2008 John Wiley & Sons, Ltd.
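As a hedged simulation counterpart to the analytic results (this sketch omits the debit interest, uses classical ruin instead of absolute ruin, and takes illustrative parameters), one can simulate the barrier strategy directly: premiums accruing above the barrier are paid out as discounted dividends.

```python
import math, random

def discounted_dividends(c, lam, mean_claim, barrier, u0, delta,
                         horizon=200.0, seed=8):
    """Compound Poisson surplus with premium rate c, Poisson(lam) claim
    arrivals, exponential claim sizes, a dividend barrier, and discount
    rate delta; return discounted dividends paid until ruin (or horizon)."""
    random.seed(seed)
    t, u, paid = 0.0, u0, 0.0
    while t < horizon:
        w = random.expovariate(lam)                    # time to next claim
        reach = (barrier - u) / c if u < barrier else 0.0
        if reach < w:
            # Dividends flow at rate c from t+reach to t+w, discounted.
            a, b = t + reach, t + w
            paid += c * (math.exp(-delta * a) - math.exp(-delta * b)) / delta
            u = barrier
        else:
            u += c * w
        t += w
        u -= random.expovariate(1.0 / mean_claim)      # claim payment
        if u < 0:
            break                                       # (classical) ruin
    return paid

v = discounted_dividends(c=1.5, lam=1.0, mean_claim=1.0,
                         barrier=5.0, u0=5.0, delta=0.05)
```

Averaging such runs over many seeds would approximate the expected discounted dividends that the paper characterizes via integro-differential equations.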

…

Article

Considering absolute log returns as a proxy for stochastic volatility, we analyse the influence of explanatory variables on the absolute log returns of ultra-high-frequency data. The irregular time structure and time dependency of the data are captured by a continuous-time ARMA(p,q) process. In particular, we propose a mixed-effect model class for the absolute log returns. Explanatory variable information is used to model the fixed effects, whereas the error is decomposed into a non-negative Lévy-driven continuous-time ARMA(p,q) process and a market microstructure noise component. The parameters are estimated in a state space approach. In a small simulation study the performance of the estimators is investigated. We apply our model to IBM trade data and quantify the influence of bid-ask spread and duration on a daily basis. To verify the correlation in irregularly spaced data we use the variogram, known from spatial statistics. Copyright © 2006 John Wiley & Sons, Ltd.

…

Article

Sales promotions such as temporary price reductions are frequently used by managers to stimulate sales in the short run. Marketing academics and practitioners tend to rely on price elasticities to summarize sales promotion effects. Although elasticities have some attractive benefits such as the invariance to measurement units, they have led to three misinterpretations in the marketing literature, as described in this paper. The proper theoretical and managerial interpretation of sales promotion effects is obtained by expressing effects in terms of absolute sales. Copyright © 2005 John Wiley & Sons, Ltd.

…

Article

An accelerometer is a transducer that measures the acceleration acting on a structure. Physically, an accelerometer consists of a central mass suspended by thin and flexible arms, and its performance is highly dependent on the dimensions of both the mass and the arms. The two most important parameters when evaluating the performance of these devices are the sensitivity and the operating frequency range (or bandwidth), the latter being limited to a fraction of the resonance frequency. Therefore, it is very convenient to know how changes in the dimensions of the mass and arms affect the natural frequency of the accelerometer, as this provides guidelines for designing accelerometers that fulfil the frequency requirements of a specific application. A quadratic polynomial function of the natural logarithm of the frequency versus geometrical factors has been obtained using a response surface methodology approach. A face-centered cube design was used in the experimentation. The data were obtained by conducting computer simulations using finite element design techniques. A better understanding of how these variables affect the frequency has been reached, which will be very useful for device design purposes. Copyright © 2009 John Wiley & Sons, Ltd.

…

Article

By approximating the distribution of the sum of correlated lognormals with some log-extended-skew-normal distribution, we present closed-form approximation formulae for pricing both Asian and basket options. Numerical comparison shows that our formulae provide both computational simplicity and accuracy. Copyright © 2008 John Wiley & Sons, Ltd.
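The log-extended-skew-normal approximation itself is not reproduced here; as a simpler, classical cousin, the sketch below moment-matches the sum of correlated lognormals with a single lognormal (the Fenton-Wilkinson approximation), which is the same general strategy of replacing an intractable sum with a tractable closed-form distribution:

```python
import math

def fenton_wilkinson(mus, sigmas, rho):
    """Match the first two moments of a sum of correlated lognormals
    exp(N(mu_i, sigma_i^2)), with common pairwise correlation rho of the
    Gaussian exponents, by a single lognormal; return its (mu, sigma)."""
    # First moment of the sum.
    m1 = sum(math.exp(mu + s * s / 2) for mu, s in zip(mus, sigmas))
    # Second moment of the sum, term by term.
    m2 = 0.0
    for i, (mi, si) in enumerate(zip(mus, sigmas)):
        for j, (mj, sj) in enumerate(zip(mus, sigmas)):
            r = 1.0 if i == j else rho
            m2 += math.exp(mi + mj + (si * si + sj * sj + 2 * r * si * sj) / 2)
    sigma2 = math.log(m2 / (m1 * m1))
    mu = math.log(m1) - sigma2 / 2
    return mu, math.sqrt(sigma2)

mu, sigma = fenton_wilkinson([0.0, 0.0], [0.3, 0.3], rho=0.5)
approx_mean = math.exp(mu + sigma * sigma / 2)   # equals m1 by construction
```

Plugging such an approximating distribution into the discounted expected payoff yields closed-form basket or Asian price formulas; the extended-skew-normal family adds a shape parameter to capture the skewness that a plain lognormal misses.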

…

Article

This article examines the length of the cycles in the gross domestic product (GDP) real per capita series of 15 countries by means of new statistical techniques based on unit root cycles. We propose tests for unit root cycles at each of the frequencies of the process. Using this approach, we are able to approximate the number of periods per cycle. The results show that the cycles have a periodicity of approximately six years when the disturbances are white noise. However, if we permit autocorrelation, they may also occur at smaller intervals of time. Copyright © 2006 John Wiley & Sons, Ltd.

…

Article

We use dynamic style analysis to unveil the strategies followed by Brazilian actuarial funds from January 2004 to August 2008 and investigate whether managers’ decisions were compatible with the intention of protecting the investor against the negative effects of inflation. The main goal of this paper is to show that this methodology is suitable for allowing insurance companies to increase their capacity to monitor the behavior of portfolios and to control the amount of risk they assume. The basic steps of the method are to build and/or choose market indexes capable of characterizing the returns of the main securities available and to apply restricted linear state space models estimated with a Kalman filter with exact initialization. The main conclusions of this paper are the following: (1) the use of exact initialization of the Kalman filter promotes numerical stability; (2) there is no need to consider the entire set of market indicators because a subset containing only three indexes spans the relevant space of investment opportunities; and (3) the actuarial funds’ resources were primarily invested in inflation-indexed bonds, but fund managers also left room to adjust their exposure to other assets not directly related to the objective of providing protection against inflation. Copyright © 2011 John Wiley & Sons, Ltd.

…

Article

The quality of a production process is often judged by a quality assurance audit, which is essentially a structured system of sampling inspection plans. The defects of sampled products are assessed and compared with a quality standard, which is determined from a tradeoff among manufacturing costs, operating costs and customer needs. In this paper, we propose a new hierarchical Bayes quality measurement plan that assumes an implicit prior for the hyperparameters. The resulting posterior means and variances are obtained adaptively using a parametric bootstrap method. Published in 2009 by John Wiley & Sons, Ltd.

…

Article

This paper applies a geo-additive generalized linear mixed model to describe the spatial variation in the prevalence of cough among children under 5 years of age, using the 2000 Demographic and Health Survey (DHS) data from Malawi. Of particular interest in the analysis were the small-area effects of the geographical locations (districts) where the children lived at the time of the survey and the effect of the metrical covariate (child's age), which was assumed to be nonlinear and was estimated nonparametrically. The model included other categorical covariates in the usual parametric form. Within a Bayesian context, we assign appropriate priors to the geographical location effects, the vector of unknown nonlinear smooth functions and a further vector of fixed-effect parameters. In particular, the spatial effects were modelled via Bayesian prior specifications reflecting spatial heterogeneity globally and relative homogeneity among neighbouring districts; thus a Markov random field prior is assumed. Inference is fully Bayesian and uses recent Markov chain Monte Carlo techniques. Copyright © 2006 John Wiley & Sons, Ltd.

…

Article

In count data regression there can be several problems that prevent the use of the standard Poisson log-linear model: overdispersion, caused by unobserved heterogeneity or correlation, excess of zeros, non-linear effects of continuous covariates or of time scales, and spatial effects. We develop Bayesian count data models that can deal with these issues simultaneously and within a unified inferential approach. Models for overdispersed or zero-inflated data are combined with semiparametrically structured additive predictors, resulting in a rich class of count data regression models. Inference is fully Bayesian and is carried out by computationally efficient MCMC techniques. Simulation studies investigate performance, in particular how well different model components can be identified. Applications to patent data and to data from a car insurance company illustrate the potential and, to some extent, the limitations of our approach. Copyright © 2006 John Wiley & Sons, Ltd.

…

Article

We examine the long-run average availability and cost rate of a maintained system which deteriorates according to a random-shock process. Shocks arrive according to a Poisson process. The system fails whenever the cumulative damage exceeds a given threshold. The system's failures are not self-announcing, hence, failures must be detected via inspections. The system is inspected at periodic or exponentially distributed intervals. Systems are replaced by preventive maintenance or after failure (corrective maintenance), whichever occurs first. When the distribution function of the shock magnitudes belongs to the class of subexponential distributions, we obtain simple approximations for the availability and the cost rate. Copyright © 2007 John Wiley & Sons, Ltd.

…

Article

Shapley value regression consists of assessing relative importance and accordingly adjusting regression coefficients. It is argued that adjustment of coefficients is unnecessary and even misleading for practically relevant situations. Examples are given, and an alternative procedure is proposed for situations for which the coefficients are requested to have a certain sign. Copyright © 2009 John Wiley & Sons, Ltd.
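For concreteness, Shapley value regression averages each predictor's marginal contribution to R² over all orders in which predictors can enter the model; the sketch below computes that decomposition on toy data (the adjustment of coefficients that the abstract criticizes is not performed here):

```python
import itertools, random

def r_squared(X, y, cols):
    """R^2 of OLS of y on the given columns (with intercept),
    via normal equations and Gaussian elimination."""
    n = len(y)
    Z = [[1.0] + [X[i][j] for j in cols] for i in range(n)]
    p = len(Z[0])
    A = [[sum(Z[i][a] * Z[i][b] for i in range(n)) for b in range(p)]
         for a in range(p)]
    b = [sum(Z[i][a] * y[i] for i in range(n)) for a in range(p)]
    for c in range(p):                       # forward elimination w/ pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):             # back substitution
        beta[c] = (b[c] - sum(A[c][k] * beta[k] for k in range(c + 1, p))) / A[c][c]
    yhat = [sum(Z[i][a] * beta[a] for a in range(p)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def shapley_r2(X, y, p):
    """Shapley decomposition of R^2: average each predictor's marginal
    R^2 gain over all orderings of entry into the model."""
    phi = [0.0] * p
    perms = list(itertools.permutations(range(p)))
    for perm in perms:
        included, prev = [], 0.0
        for j in perm:
            included.append(j)
            cur = r_squared(X, y, included)
            phi[j] += cur - prev
            prev = cur
    return [v / len(perms) for v in phi]

random.seed(4)
n = 300
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(n)]
y = [x1 + x2 + random.gauss(0, 1) for x1, x2 in X]
phi = shapley_r2(X, y, 2)   # two equally important predictors
```

The shares sum to the full-model R² by construction; with equal true coefficients the two shares come out roughly equal.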

…

Article

This paper presents a hierarchical Bayesian analysis of the partial adjustment model of financial ratios using mixture models, an approach that allows us to estimate the distribution of the adjustment coefficients. More particularly, it enables us to analyse speed of reaction in the presence of shocks affecting financial ratios objectives as a basis to establish homogenous groups of firms. The proposed methodology is illustrated by examining a set of ratios for a sample of firms operating in the U.S. manufacturing sector. Copyright © 2007 John Wiley & Sons, Ltd.

…

Article

In this paper we develop a Bayesian procedure for feedback adjustment and control of a single process. We replace the usual exponentially weighted moving average (EWMA) predictor by the predictor of a local level model. The novelty of this approach is that the noise variance ratio (NVR) of the local level model is assumed to change stochastically over time. A multiplicative time series model is used to model the evolution of the NVR, and a Bayesian algorithm is developed giving the posterior and predictive distributions for both the process and the NVR. The posterior distribution of the NVR allows the modeller to better judge and evaluate the performance of the model. The proposed algorithm is semi-conjugate in the sense that it involves conjugate gamma/beta distributions as well as one step of simulation. The algorithm is fast and is found to outperform the EWMA and other methods. An example considering real data from the microelectronics industry illustrates the proposed methodology. Copyright © 2006 John Wiley & Sons, Ltd.
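For orientation, the EWMA predictor this paper generalizes is exactly the steady-state Kalman filter of the local level model, with the smoothing constant determined by the NVR; the sketch below (with an illustrative fixed NVR, whereas the paper lets it vary stochastically) verifies the correspondence on simulated data:

```python
import math, random

def steady_state_gain(nvr):
    """Steady-state Kalman gain of the local level model with noise
    variance ratio q = var(level shocks)/var(obs noise); the filter then
    coincides with an EWMA whose smoothing constant equals this gain."""
    return (-nvr + math.sqrt(nvr * nvr + 4 * nvr)) / 2

def ewma_forecasts(y, gain):
    """One-step-ahead EWMA forecasts m_t = m_{t-1} + gain*(y_t - m_{t-1})."""
    m = y[0]
    preds = []
    for t in range(1, len(y)):
        preds.append(m)                 # forecast before seeing y[t]
        m = m + gain * (y[t] - m)       # EWMA / steady-state Kalman update
    return preds

random.seed(6)
# Simulate a local level series: random-walk level plus observation noise.
q = 0.1
mu, ys = 0.0, []
for _ in range(2000):
    mu += random.gauss(0, math.sqrt(q))
    ys.append(mu + random.gauss(0, 1.0))
k = steady_state_gain(q)
preds = ewma_forecasts(ys, k)
mse = sum((p - yt) ** 2 for p, yt in zip(preds, ys[1:])) / len(preds)
```

For q = 0.1 the gain is about 0.27 and the theoretical one-step prediction variance is about 1.37; a time-varying NVR, as in the paper, would make the gain adapt over time.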

…

Article

This paper deals with the problem of selecting profitable customer orders that arrive sequentially at a company operating in service industries with multiple servers, in which two classes of services are provided. The first class of service is designed to meet the particular needs of customers; the company (1) decides whether to accept or reject an order for this service (admission control) and (2) sets a price for the order and offers it to an arriving customer (pricing control). The second class of service is provided as a sideline, which prevents servers from being idle when the number of customer orders for the first class is less than the number of servers; this yields the sideline profit. A cost, called the search cost, is paid to search for customer orders, and the company has the option of whether or not to conduct the search. In this paper, we discuss both the admission control and pricing control problems within a single framework and examine the structure of the optimal policies that maximize the total expected net profit gained over an infinite planning horizon. We show that when the sideline profit is large, the optimal policies may not be monotone in the number of customer orders in the system. Copyright © 2008 John Wiley & Sons, Ltd.

…

Article

Recent reforms intended to promote more accountable and responsive government have increased public attention to performance analysis and accelerated the production and use of information on agency performance and public program outcomes. Drawing from cases and empirical studies, this presentation considers questions about what should count as evidence, how it should be communicated, who should judge the quality and reliability of evidence and performance information, and how to achieve a balance between processes that produce rigorous information for decision making and those that foster democratic governance and accountability. Promising directions are suggested for efforts to improve government effectiveness through the use of more rigorous information in decision making, along with acknowledgment of the limitations and risks associated with such efforts. Copyright © 2008 John Wiley & Sons, Ltd.

…

Article

Random assignment experiments are discussed by drawing parallels between issues in performance management studies and in clinical trials. In addition, the need for statistical rigour and for measures of uncertainty in performance management tools is highlighted. Copyright © 2008 John Wiley & Sons, Ltd.

…


Article

Adversarial risk analysis (ARA) offers a new solution concept in game theory. This paper explores its application to a range of simple gambling games, enabling comparison with minimax solutions for similar problems. We find that ARA has several attractive advantages: it is easier to compute, it takes account of asymmetric information, it corresponds better to human behavior, and it reduces to previous solutions in appropriate circumstances. Copyright © 2011 John Wiley & Sons, Ltd.

…

Article

The future of advertising response models with managerial impact is discussed. Brand equity is a simplifying concept, since it allows the long term to be structured as a series of short terms adjusted by the change in brand equity at the transitions between time periods. Advertising does not directly change sales, share, or any other performance metric; if it has any effect at all, it changes brand equity, which in turn may change subsequent performance. Managerial relevance requires using the performance metrics that practitioners use, such as shareholder value, whereas some metrics in the literature may owe more to their availability for journal-article purposes than to their intrinsic value for managers.

…

Article

This paper discusses recent advances in advertising response models and identifies new opportunities for managerially relevant research. First, it establishes that recent research has shifted attention from topics such as duration of advertising effects in mature markets and short-term advertising elasticities to issues such as combined effects of ad content and weight and effectiveness in evolving markets. Then, motivated from recent trends in advertising practice, it presents a research agenda consisting of four main topics: (1) new media and forms of advertising (e.g. product placement), (2) media synergies, (3) advertising productivity and (4) advertising effects on performance stability. Copyright © 2005 John Wiley & Sons, Ltd.

…

Article

In this paper, we study the expectation of aggregate dividends until ruin for a Sparre Andersen risk process perturbed by diffusion under a threshold strategy, in which claim waiting times have a common generalized Erlang(n) distribution. Under this strategy, if the surplus is above a certain threshold level before ruin, dividends are continuously paid at a constant rate that does not exceed the premium rate; otherwise, no dividends are paid. We obtain integro-differential equations satisfied by the expected discounted dividends, and further derive the associated renewal equations. Finally, applying these results to the Erlang(2) risk model perturbed by diffusion, where claims have a common exponential distribution, we give explicit expressions and a numerical analysis. Copyright © 2007 John Wiley & Sons, Ltd.
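
The threshold strategy described above can be sketched with a small Monte Carlo simulation. The sketch below assumes, for simplicity, exponential inter-claim times (the paper treats generalized Erlang(n) waiting times) and purely illustrative parameter values; all names and numbers here are assumptions, not the paper's.

```python
import math
import random

def expected_discounted_dividends(u=10.0, b=15.0, c=2.0, d=1.0,
                                  lam=1.0, claim_mean=1.5, sigma=0.5,
                                  delta=0.05, dt=0.02, horizon=50.0,
                                  n_paths=300, seed=42):
    """Monte Carlo estimate of expected discounted dividends until ruin
    under a threshold strategy: whenever the surplus exceeds b, dividends
    are paid at a constant rate d (d <= premium rate c); otherwise none.
    Exponential claims and a Brownian perturbation are assumed here for
    simplicity; the paper handles generalized Erlang(n) waiting times."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, t, paid = u, 0.0, 0.0
        while t < horizon and x > 0:
            rate = d if x > b else 0.0
            # discounted dividend income accrued over this small step
            paid += math.exp(-delta * t) * rate * dt
            # premium income minus dividends, plus diffusion increment
            x += (c - rate) * dt + sigma * rng.gauss(0.0, math.sqrt(dt))
            # claim arrival with probability ~ lam * dt (Poisson thinning)
            if rng.random() < lam * dt:
                x -= rng.expovariate(1.0 / claim_mean)
            t += dt
        total += paid
    return total / n_paths

v = expected_discounted_dividends()
```

The estimate is necessarily below `d / delta`, the value of a perpetual dividend stream at rate `d`, which gives a quick sanity check on the simulation.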

…

Article

In this paper we investigate the effects of temporal aggregation on a class of Markov-switching models known as Markov-switching normal (MSN) models. The growing popularity of MSN processes in modelling financial returns can be attributed to their inherent flexibility, allowing for heteroscedasticity, asymmetry and excess kurtosis. The distributions of the process described by the basic MSN model and of the corresponding temporally aggregated data are derived; they belong to a general class of normal mixture distributions. The limiting behaviour of the aggregated MSN model, as the order of aggregation tends to infinity, is studied. We provide explicit formulae for the volatility, autocovariance, skewness and kurtosis of the aggregated processes. An application to measuring solvency risk with MSN models, for horizons from 1 up to 10 years, based on about 50 years of the U.S. S&P 500 stock market total return time series, is given. Copyright © 2008 John Wiley & Sons, Ltd.
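
The qualitative effect of temporal aggregation on an MSN process can be illustrated by simulation. The two-state parameters below are illustrative, not taken from the paper; regime means are set to zero so the example isolates the variance-mixing (excess kurtosis) effect that aggregation dilutes.

```python
import random
import statistics

def simulate_msn(n, p_stay=(0.95, 0.90), sd=(0.01, 0.04), seed=1):
    """Simulate a two-state Markov-switching normal (MSN) return series
    with zero regime means. Parameters are illustrative assumptions."""
    rng = random.Random(seed)
    state, out = 0, []
    for _ in range(n):
        if rng.random() > p_stay[state]:   # leave the current regime
            state = 1 - state
        out.append(rng.gauss(0.0, sd[state]))
    return out

def aggregate(x, m):
    """Temporal aggregation: sum non-overlapping blocks of length m."""
    return [sum(x[i:i + m]) for i in range(0, len(x) - m + 1, m)]

def excess_kurtosis(x):
    mu = statistics.fmean(x)
    s2 = statistics.fmean([(v - mu) ** 2 for v in x])
    m4 = statistics.fmean([(v - mu) ** 4 for v in x])
    return m4 / s2 ** 2 - 3.0

r = simulate_msn(50_000)
k1 = excess_kurtosis(r)                   # base series: leptokurtic
k12 = excess_kurtosis(aggregate(r, 12))   # aggregated: closer to normal
```

As the abstract's limiting result suggests, the aggregated series shows less excess kurtosis than the base series, since block sums increasingly mix the two regime variances.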

…

Article

The BMOM is particularly useful for obtaining post-data moments and densities for parameters and future observations when the form of the likelihood function is unknown and thus a traditional Bayesian approach cannot be used. Also, even when the form of the likelihood is assumed known, in time series problems it is sometimes difficult to formulate an appropriate prior density. Here, we show how the BMOM approach can be used in two nontraditional problems. The first is conditional forecasting in regression and time series autoregressive models. Specifically, it is shown that when forecasting disaggregated data (say quarterly data) given aggregate constraints (say in terms of annual data), it is possible to apply a Bayesian approach to derive conditional forecasts in the multiple regression model. The types of constraints (conditioning) usually considered are that the sum, or the average, of the forecasts equals a given value. This kind of condition can be applied to forecasting quarterly values whose sum must equal a given annual value. Analogous results are obtained for AR(p) models. The second problem we analyse is the issue of aggregation and disaggregation of data in relation to predictive precision and modelling. Predictive densities are derived for future aggregate values by means of the BMOM based on a model for disaggregated data. They are then compared with those derived based on aggregated data. Copyright © 2006 John Wiley & Sons, Ltd.
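
The sum-type constraint mentioned above has a simple point-forecast analogue: adjust unconstrained quarterly forecasts minimally, in the least-squares sense, so they add up to a known annual value. The sketch below shows only that projection step, not the full BMOM machinery (which delivers post-data densities); the numbers are illustrative.

```python
def constrain_to_total(forecasts, total):
    """Minimally adjust point forecasts (in the least-squares sense) so
    that sum(forecasts) == total: spread the gap equally across periods.
    This is only the projection step behind conditional forecasting;
    the BMOM approach additionally yields post-data densities."""
    gap = total - sum(forecasts)
    return [f + gap / len(forecasts) for f in forecasts]

quarters = [24.0, 26.0, 27.0, 25.0]             # unconstrained forecasts
adjusted = constrain_to_total(quarters, 100.0)  # known annual value
```

Here the quarterly forecasts sum to 102, so each is lowered by 0.5 to meet the annual total of 100 while staying as close as possible to the original values.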

…

Article

A new notion of bivariate aging in the competitive risks framework is introduced. Aging properties of bivariate distributions are defined by the aging properties of marginals and by the aging properties of the corresponding dependence structure. The dependence structure is defined explicitly via the bivariate generalization of the corresponding univariate exponential representation. The new notion of positive (negative) hazard rates dependence is considered. This type of bivariate dependence is weaker than the well-known positive (negative) quadrant dependence. Sufficient conditions for the weak IFR aging (weak DFR negative aging) are derived and a number of simple but meaningful examples are considered. The generalization to the multivariate (n>2) case is briefly discussed.

…

Article

This paper considers an aging multi-state system whose failure rate varies with time. After any failure, maintenance is performed by an external repair team; the repair rate and the cost of each repair are determined by the corresponding corrective maintenance contract. The service market can provide different kinds of maintenance contracts to the system owner, and the contract can be changed after each specified time period. The owner of the system would like to determine a series of repair contracts over the system life cycle that minimizes the total expected cost while satisfying a system availability requirement; operating cost, repair cost and penalty cost for system failures are taken into account. The paper proposes a method for determining such an optimal series of maintenance contracts. The method is based on a piecewise constant approximation of the increasing failure rate function, which yields lower and upper bounds on the total expected cost and system availability via Markov models. A genetic algorithm is used as the optimization technique, and a numerical example illustrates the approach. Copyright © 2009 John Wiley & Sons, Ltd.
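
The bounding idea, approximating an increasing failure rate by piecewise constant segments, can be sketched as follows. A Weibull hazard with illustrative parameters stands in for the system's failure rate (an assumption, not the paper's model); evaluating it at the left and right endpoints of each interval gives the lower and upper bounding rates that would feed the bounding Markov models.

```python
def weibull_hazard(t, beta=2.0, eta=10.0):
    """Increasing Weibull failure rate (beta > 1); illustrative choice."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def piecewise_bounds(t_end, n_pieces, hazard=weibull_hazard):
    """Piecewise constant approximation of an increasing failure rate:
    on each interval, the value at the left endpoint under-estimates the
    rate and the value at the right endpoint over-estimates it, giving
    the constant rates for lower/upper bounding Markov models."""
    step = t_end / n_pieces
    lower = [hazard(i * step) for i in range(n_pieces)]
    upper = [hazard((i + 1) * step) for i in range(n_pieces)]
    return lower, upper

lo, hi = piecewise_bounds(t_end=20.0, n_pieces=5)
```

Because the hazard is increasing, every lower-bound segment sits below its upper-bound counterpart, and refining `n_pieces` tightens the gap between the two Markov approximations.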

…

Top-cited authors