# Value at Risk: The New Benchmark for Managing Financial Risk

**Article** · January 2000


- ... Since the problem of quantifying risk arose, VaR was put into practice as a risk-measurement method and proposed in detail by [1]. The worst loss that can occur over a given period of time at a given level of confidence is defined as Value-at-Risk according to [2]. ...... A study by [2] noted that the prices of financial assets often react more to "bad news than good news", and such a condition contributes to leverage. [3] quantified the effects of good and bad news on volatility and found that there was asymmetry in the volatility of stock markets. ...
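The definition cited above — the worst loss over a given horizon that will not be exceeded at a given confidence level — maps directly onto an empirical quantile of the return distribution. A minimal historical-simulation sketch in Python; the function name and the simulated return series are illustrative, not taken from the cited works:

```python
import numpy as np

def historical_var(returns, confidence=0.95):
    """Worst loss not exceeded with the given confidence, estimated as
    the empirical (1 - confidence)-quantile of the return sample,
    reported as a positive loss number."""
    return -np.quantile(returns, 1.0 - confidence)

# Simulated daily returns, purely for illustration
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, 10_000)
var_95 = historical_var(returns, 0.95)  # one-day 95% VaR as a return fraction
```

For normally distributed returns this converges to roughly 1.645 standard deviations below the mean, which is the parametric value the literature above compares against.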
- ... A bank is exposed to asset-liquidity risk when a transaction cannot be executed at the prevailing market prices, which could be a consequence of the size of the position relative to the normal trading lot size. Funding liquidity risk refers to the inability to meet cash flow obligations, and is also known as cash flow risk (Jorion 2007). Banks are required to establish a robust liquidity risk management framework that would ensure sufficient liquidity is maintained, including the ability to withstand a range of stress events. ...... Risk can be measured by the standard deviation of unexpected outcomes, also called volatility. Value at Risk (VAR) calculates the worst loss over a target horizon that will not be exceeded with a given level of confidence and captures the combined effect of underlying volatility and exposure to financial risks (Jorion 2007). Volatility forecasting in the financial markets is important in the areas of risk management and asset pricing, among others. ...Article · Full-text available
- Mar 2019

There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on the developments in banking and risk management and the current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review has shown that the application of machine learning in the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, it does not appear commensurate with the current industry level of focus on both risk management and machine learning. A large number of areas remain in bank risk management that could significantly benefit from the study of how machine learning can be applied to address specific problems. - ... According to Jorion [3], there are two approaches to VaR estimation: the parametric and the non-parametric. The parametric approach essentially uses a probability distribution in estimating VaR. ...... Jorion [3] states that two methods are generally used in estimating VaR with the parametric approach. The first is the delta-normal, or variance-covariance, method. ...
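The delta-normal (variance-covariance) method named in the snippet above computes VaR as a normal quantile times the portfolio's dollar volatility, obtained from the exposure vector and the covariance matrix of asset returns. A hedged sketch; the exposures and covariance values are made up purely for illustration:

```python
import numpy as np

def delta_normal_var(positions, cov, z=1.645):
    """Parametric (delta-normal / variance-covariance) portfolio VaR.
    positions: dollar exposures per asset; cov: covariance matrix of
    asset returns; z: normal quantile (1.645 for 95% confidence)."""
    w = np.asarray(positions, dtype=float)
    port_sigma = np.sqrt(w @ cov @ w)  # dollar volatility of the portfolio
    return z * port_sigma

# Illustrative two-asset portfolio
positions = [1_000_000, 500_000]            # dollar exposures
cov = np.array([[0.0001, 0.00003],          # daily return covariances
                [0.00003, 0.0004]])
var = delta_normal_var(positions, cov)      # one-day 95% VaR in dollars
```

The method is fast because it needs only first-order (delta) exposures, which is also why — as the Greek-approximation snippet further down notes — it degrades for portfolios with strongly nonlinear option positions.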
- ... Here, we use the Value-at-Risk (VaR) approach, which considers risk by setting a minimum threshold of return that the decision maker wishes to achieve in a worst-case scenario. This threshold, and therefore the decision maker's level of risk aversion, is represented by a quantile of the distribution of returns, expressed as the α-value. The Value-at-Risk is calculated as the (1-α)-quantile of the inverse distribution function of F_R (Jorion, 2009; Jorion, 1997). The VaR at a certain confidence level is called the α-Value-at-Risk, for example the 95% VaR (Andersson et al., 2001; Rockafellar and Uryasev, 2000). ...... The α-values commonly used are 99%, 95% and 90%. A 99% VaR represents a very risk-averse decision maker, for example in industrial production (Jorion, 2009). The 95% and 90% VaR express medium and low risk aversion, respectively. ...Article
- Apr 2019
- FOREST POLICY ECON

Productivity in mixed forest stands is often higher than that in pure forest stands. Economic analyses usually exclude this phenomenon. In our study, we assess the consequences of an increased productivity in mixed forests for the economically optimal proportions of tree species in forest stands. The economic model applied centers around the Modern Portfolio Theory with the Value-at-Risk as the objective function. The methodological approach encompasses a structured Monte Carlo simulation to generate distributions of returns for different scenarios. Risks include price fluctuations for raw wood as well as natural hazards. The study's major novelty is that it combines well-researched productivity relations in mixed forests with recent empirical survival models for mixed and pure stands to address site variability across Bavaria. We parametrized the model with simulated growth data for Norway spruce and European beech for various climatic and geological conditions. From the results, we conclude that decision makers can increase the Value-at-Risk of forest portfolios when overyielding occurs, by choosing higher shares of beech. Climatic conditions and the extent of overyielding strongly influence optimal stand composition. Using Value-at-Risk as a risk measure that regards both expected returns and their variation, however, leads to the observation that increased risk aversion rather favors higher shares of spruce in forest stands, despite its higher biophysical risks. https://authors.elsevier.com/a/1Ytmk_Vcdiy3D2 - ... Depending on the predicted price, there are situations when the partload at which the expected savings are maximized involves the highest risk. In order to quantify this risk, a popular measure in financial analysis is the Value at Risk (VaR), which represents a breakpoint; it is the worst-case result of a decision associated with a probability [36]. 
It is usually used to represent the worst-case losses that can be expected for a given level of confidence α. ...Conference Paper · Full-text available
- Jun 2019

This work is concerned with the integration and coordination of decentralized combined heat and power (CHP) systems in commercial buildings. Although extensive research has been performed on theoretically optimizing the design, sizing and operation of CHP systems, less effort has been devoted to an understanding of the practical challenges and the effects of uncertainty of implementing advanced algorithms in real-world applications. This paper provides details of an ongoing field trial involving the installation of a dynamic controller for the optimal operation of an existing CHP engine, which provides electricity and heat to a supermarket. The challenges in developing and applying an optimization framework and the software architecture required to implement it are discussed. Deterministic approaches that involve no measure of uncertainty provide limited useful insight to decision makers. For this reason, the methodology here develops a stochastic programming technique, which performs Monte Carlo simulations that can consider the uncertainty related to the exporting electricity price. The method involves the formation of a bi-objective function that represents a compromise between maximizing the expected savings and minimizing the associated risk. The results reveal a risk-return trade-off, demonstrating that conservative operation choices emerging from the stochastic approach can reduce risk by about 15% at the expense of a noticeably smaller reduction of about 10% in expected savings. - ... In view of the aforementioned difficulties, current market practice is to calculate VaRs via Greek approximations such as the delta-normal and delta-gamma approximations; see Jorion (2006). Performance of these approaches can sometimes be disappointing. ...Preprint · Full-text available
- Apr 2019

Value-at-risk (VaR) has been playing the role of a standard risk measure since its introduction. In practice, the delta-normal approach is usually adopted to approximate the VaR of portfolios with option positions. Its effectiveness, however, substantially diminishes when the portfolios concerned involve a high dimension of derivative positions with nonlinear payoffs; the lack of a closed-form pricing solution for these potentially highly correlated, American-style derivatives further complicates the problem. This paper proposes a generic simulation-based algorithm for VaR estimation that can be easily applied to any existing procedures. Our proposal leverages cross-sectional information and applies variable selection techniques to simplify the existing simulation framework. Asymptotic properties of the new approach demonstrate faster convergence due to the additional model selection component introduced. We also present sets of numerical results that verify the effectiveness of our approach in comparison with some existing strategies. - ... According to Jorion (2007), banks and financial institutions have faced financial risks. Banking is one of the businesses most vulnerable to interest rate and exchange rate fluctuations. ...Article · Full-text available
- Feb 2019

Banks, through the financial services they provide, play a significant role in the country's economy. The banking sector in Kosovo is one of the essential catalysts of economic growth. The efficiency and performance of the banking industry is a leading indicator of the country's financial stability. The pace of economic growth and long-term stability in the country varies with the level of credit and with the economic activities the banks finance. Credit risk is the primary determinant of banking performance: the higher the risk, the higher the probability of bank loss, and vice versa. This study discusses banking activities and developments in general, and analyses the financial system, especially banks, with particular emphasis on the importance of credit risk management. - ... However, the quantile makes the optimization Problem (4) hard to solve. Note that Q_γ(P_d) is also called the γ-Value at Risk (VaR), an important risk measure in finance (Jorion (2001)). In order to address shortcomings such as the discouragement of diversification, Artzner et al. (1999) studied an alternative risk measure called Conditional Value at Risk (CVaR), also known as the expected shortfall, average value at risk or expected tail loss. ...Preprint
- Mar 2019

With the emergence of precision medicine, estimating optimal individualized decision rules (IDRs) has attracted tremendous attention in many scientific areas. Most existing literature has focused on finding optimal IDRs that can maximize the expected outcome for each individual. Motivated by complex individualized decision-making procedures and popular conditional value at risk (CVaR) measures, we propose two new robust criteria to estimate optimal IDRs: one is to control the average lower tail of the subjects' outcomes and the other is to control the individualized lower tail of each subject's outcome. In addition to optimizing the individualized expected outcome, our proposed criteria take risks into consideration, and thus the resulting IDRs can prevent adverse events caused by the heavy lower tail of the outcome distribution. Interestingly, from the perspective of duality theory, the optimal IDR under our criteria can be interpreted as the decision rule that maximizes the "worst-case" scenario of the individualized outcome within a probability constrained set. The corresponding estimating procedures are implemented using two proposed efficient non-convex optimization algorithms, which are based on the recent developments of difference-of-convex (DC) and majorization-minimization (MM) algorithms that can be shown to converge to the sharpest stationary points of the criteria. We provide a comprehensive statistical analysis for our estimated optimal IDRs under the proposed criteria, such as consistency and finite sample error bounds. Simulation studies and a real data application are used to further demonstrate the robust performance of our methods. - ... Minimizing the expected longest time to first detection over the infestation scenarios requires controlling the right tail of the distribution of detection times with a percentile metric that characterises the expected tail value (Jorion, 2006; Studer, 1997).
In particular, value-at-risk (VaR) and conditional value-at-risk (CVaR) are used in the finance field to evaluate the risk of extreme losses (Acerbi and Tasche, 2002; Duffie and Pan, 1997; Inui and Kijima, 2005; Uryasev, 2000, 2002). ...Article
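The VaR/CVaR distinction drawn in the snippet above can be made concrete: CVaR (expected shortfall) averages the losses at or beyond the VaR quantile, so it always weakly exceeds VaR and captures the tail that VaR ignores. A small empirical sketch; the function name and sample are mine, not from the cited papers:

```python
import numpy as np

def var_cvar(returns, confidence=0.95):
    """Empirical VaR and CVaR from a return sample. CVaR is the mean of
    the losses that are at least as large as the VaR, hence CVaR >= VaR."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, confidence)
    cvar = losses[losses >= var].mean()
    return var, cvar

# Standard-normal returns, purely for illustration
rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, 100_000)
var95, cvar95 = var_cvar(sample, 0.95)
```

For a standard normal sample the 95% VaR sits near 1.64 standard deviations while the 95% CVaR sits near 2.06, which illustrates why CVaR-based criteria penalize heavy lower tails more strongly.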
- Jun 2019
- ECOL ECON

Surveillance programs to detect alien invasive pests seek to find them as soon as possible, but also to minimize the cost of damage from invasion. To examine the trade-offs between these objectives, we developed an economic model that allocates survey sites to minimize either the expected mitigation costs or the expected time until first detection of an invasive alien pest subject to a budget constraint on surveillance costs. We also examined strategies preferred by ambiguity-averse decision makers that minimize the expected and worst-case outcomes of each performance measure. We applied the model to the problem of detecting Asian longhorned beetle (Anoplophora glabripennis) in the Greater Toronto Area, Canada, one of the most harmful invasive alien insects in North America. When minimizing expected mitigation costs or expected time to detection, the trade-off between these survey objectives was small. Strategies that minimize the worst-case mitigation costs differed sharply and surveyed sites with high host densities using high sampling intensities, whereas strategies that minimize the worst detection times surveyed sites across the entire area using low sampling intensities. Our results suggest that preferences for minimizing mitigation costs or time to detection are more consequential for ambiguity-averse managers than they are for risk-neutral decision-makers. - ... The main idea of the model is to find out the probability of bankruptcy when the surplus level is less than zero [4]. Value at Risk (VaR) also constitutes a method to measure the risk of investment losses based on probability theory and mathematical statistics [8]. However, VaR only focuses on the statistical characteristics of risk rather than on system-wide risk management. ...Conference Paper
- Apr 2019

The sharing economy has attracted wide attention since its inception, and bike sharing is one of its most representative examples. The paper attempts to investigate the cashflow management strategy of bike sharing companies to optimize the overall financial return. The framework of our model is based on the assumption that bike sharing companies may invest operation income in the financial market for long-term and short-term earnings. The optimal reserve pool is modeled and estimated using double-parameterized compound Poisson distributions. Empirical examples and Monte Carlo analysis are provided for model validation. - ... In commercial applications, VaR has been a widely used risk metric. The 1988 Basel Accord signed by the G-10 countries imposed risk management on banks, which has motivated the use of VaR methodology to comply with regulations (Jorion, 2001). Several United States government agencies such as the Federal Reserve and the Securities and Exchange Commission (SEC) use or advocate the use of VaR (Khindanova et al., 2000). ...Article · Full-text available
- Feb 2019

In this paper, we explored market risk in Brazil by considering different sectoral indices of the Brazilian stock market and the GARCH Value-at-Risk (GARCH-VaR) estimation approach. We carried out a statistical evaluation of eight Brazilian sectoral stock indices during different time ranges so that GARCH-VaR methodologies could be chosen according to the data. We analyzed the sectoral indices from a Value-at-Risk point of view using recent data. The results of the study reveal that VaR may be an effective tool for minimizing risk exposure and potentially avoiding losses when trading on the Brazilian stock market. Furthermore, we showed that different sectors of the Brazilian economy have significantly different risk behavior. In particular, the consumption and industrial sectoral indices presented the best risk performance. In this sense, we highlight that this type of analysis would be useful to small investors in evaluating the attractiveness of investing in the Brazilian stock market. - ... Value-at-Risk (VaR), initially proposed by JP Morgan, is a method for risk estimation used in financial markets (Jorion 2007). It is an alternative risk estimator that can quantify the volatility among different quantiles, i.e. from the i-th quantile of the probability distribution function. ...Article
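A GARCH-VaR estimate of the kind the abstract above describes can be sketched as a conditional-variance filtering recursion followed by a normal-quantile scaling. This is a simplified illustration: the GARCH(1,1) parameters below are stand-ins for a maximum-likelihood fit, and the normal tail is a simplifying assumption (the cited work selects distributions according to the data):

```python
import numpy as np
from scipy.stats import norm

def garch_var_forecast(returns, omega, alpha, beta, confidence=0.95):
    """One-step-ahead VaR from GARCH(1,1) parameters: filter the
    conditional variance through the sample, then scale the normal
    quantile by the forecast volatility."""
    var_t = np.var(returns)  # initialize at the sample (unconditional) variance
    for r in returns:
        var_t = omega + alpha * r**2 + beta * var_t
    return -norm.ppf(1.0 - confidence) * np.sqrt(var_t)

# Illustrative parameters, as if obtained from a prior ML fit
rng = np.random.default_rng(2)
simulated = rng.normal(0.0, 0.0045, 1_000)  # stand-in daily return series
var_1day = garch_var_forecast(simulated, omega=1e-6, alpha=0.05, beta=0.90)
```

Because the recursion reacts to recent squared returns, the resulting VaR widens after volatile days and tightens in calm periods, which is the practical advantage over a static quantile.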
- Apr 2019
- EMERG MARK FINANC TR

We analyzed the effects of inflation targeting (IT) implementation and functioning through the reaction function of monetary authorities from Latin American (LA) inflation targeters (ITers), e.g. Brazil, Chile, Colombia, Mexico, and Peru. We adapted the Value-at-Risk (VaR) and CoVaR to the Inflation-at-Risk (IaR) and Co-Inflation-at-Risk (CoIaR), respectively, to estimate the inflation at the extremes of its probability density functions. The results suggested that the IT was able to reduce inflation risk for all ITers. Chile and Peru are further ahead in terms of inflationary control, whereas in Brazil, it is more difficult. We propose the IaR and CoIaR as additional risk-management tools. - ... We can use D98%,90% as a target minimum dose measure which is consistent with the common margin recipe. A mathematically equivalent measure to PD is the Value at Risk (VaR), which has been used in stock portfolio management to estimate the maximum loss of the Q-th percentile worst outcome of the portfolio [7]. However, VaR and PD yield non-convex optimization problems [8]. ...Article · Full-text available
- Apr 2019

- ... (Derivatives will be called options from now on for brevity.) They are essential for pricing, speculation, risk management, and model calibration (Cui et al. 2017, Hull 2014, Jorion 2001). Studies on hedging with Greeks and their calculations can be found in Dumas et al. (1998), Pelsser and Vorst (1994) and Bernis et al. (2003). ...Article
- Jan 2019
- QUANT FINANC

Greeks are the price sensitivities of financial derivatives and are essential for pricing, speculation, risk management, and model calibration. Although the pathwise method has been popular for calculating them, its applicability is problematic when the integrand is discontinuous. To tackle this problem, this paper defines and derives the parameter derivative of a discontinuous integrand of certain functional forms with respect to the parameter of interest. The parameter derivative is such that its integration equals the differentiation of the integration of the aforesaid discontinuous integrand with respect to that parameter. As a result, unbiased Greek formulas for a very broad class of payoff functions and models can be systematically derived. This new method is applied to the Greeks of (1) Asian options under two popular Lévy processes, i.e. Merton's jump-diffusion model and the variance-gamma process, and (2) collateralized debt obligations under the Gaussian copula model. Our Greeks outperform the finite-difference and likelihood ratio methods in terms of accuracy, variance, and computation time. - ... The minmax formulation minimizes the damage of the worst outcome, but it may not minimize the expected value of the right tail of the damage distribution. Instead, we use percentile-based metrics which offer better control of the expected tail value, such as value-at-risk (Studer, 1997; Jorion, 2006) and conditional tail expectation (CTE), also known as conditional value-at-risk (CVaR). Percentile metrics have been widely used to assess extreme losses in finance (e.g. ...Preprint · Full-text available
- Jan 2019

We develop an acceptance sampling approach for surveillance of the emerald ash borer (EAB), a harmful forest pest, in Winnipeg, Canada. We compare sampling strategies computed with two different management objectives. The first objective maximizes the expected area with detected infestations and the second objective minimizes the expected number of undetected infested trees in sites that were not inspected or where inspection did not find an infestation. The choice of the management objective influences the survey strategy: achieving the first objective involves selecting sites with high infestation rates proximal to the infested area, whereas the second objective requires inspecting sites with both high infestation rates and high host densities. Adding uncertainty prescribes inspecting a larger area with lower sampling rates and extending the surveys to farther distances from the infested locations. If a decision maker wants to minimize the worst-case damage from failed detections, the optimal strategy is to survey more sites with high host densities at farther distances, where EAB arrivals could cause significant damage if not detected quickly. Accounting for the uncertainty addresses possible variation in infestation rates and helps develop a more diversified survey strategy. The approach is generalizable and can support survey programmes for new pest incursions. - ... In this case, it is required to compare several random variables synthesized through their percentiles and statistical moments. 
Several approaches have been proposed to this end, such as a simple comparison of the expected value, the expected utility (Von Neumann & Morgenstern, 1947), the use of low order moments (Markowitz, 1952), risk measures (Jorion, 2007; Mansini, Ogryczak, & Speranza, 2007; Rockafellar & Uryasev, 2000), the Partitioned Multiobjective Risk Method (PMRM; Asbeck & Haimes, 1984; Haimes 2009), and the stochastic dominance theory (Levy, 2006), among others. To consider the risk evaluation induced by uncertainty, each alternative is represented by the third synthetic attribute: compliance. This new attribute is based on a simultaneous assessment of several risk measures and some moments of each AQ distribution (Mun et al., 2016). At this point, each alternative is assessed from three different angles: 1. ...Conference Paper · Full-text available
- Jun 2019

This research has the explicit goal of proposing a reusable, extensible, adaptable, and comprehensive advanced analytical modeling process to help the U.S. Department of Defense (DOD) with risk-based capital budgeting and optimizing acquisitions and programs portfolios with multiple competing stakeholders while subject to budgetary, risk, schedule, and strategic constraints. The research covers topics of traditional capital budgeting methodologies used in industry, including the market, cost, and income approaches, and explains how some of these traditional methods can be applied in the DOD by using DOD-centric non-economic, logistic, readiness, capabilities, and requirements variables. Portfolio optimization for the purposes of selecting the best combination of programs and capabilities is also addressed, as are other alternative methods such as average ranking, risk metrics, lexicographic methods, PROMETHEE, ELECTRE, and others. Finally, an illustration at Program Executive Office Integrated Warfare Systems (PEO IWS) and Naval Sea Systems Command (NAVSEA) is presented to showcase how the methodologies can be applied to develop a comprehensive and analytically robust case study that senior leadership at the DOD may utilize to make optimal decisions. - ... The application of S&P 500 data to the new DCS models is useful, for example, for investors of (i) well-diversified US equity portfolios; (ii) S&P 500 futures and options contracts traded at the Chicago Mercantile Exchange (CME); (iii) exchange traded funds (ETFs) related to the S&P 500. For practitioners, the new DCS models with dynamic shape parameters may provide precise estimates and forecasts of (i) stock market volatility for pricing financial derivatives (Hull 2018), and (ii) other classical risk measurement metrics, such as value-at-risk (VaR) (Jorion, 2006) and expected shortfall (ES) (Acharya et al., 2012, 2017). ...Preprint · Full-text available
- Jan 2019

We introduce new dynamic conditional score (DCS) volatility models with dynamic scale and shape parameters for the effective measurement of volatility. In the new models, we use the EGB2 (exponential generalized beta of the second kind), NIG (normal-inverse Gaussian) and Skew-Gen-t (skewed generalized-t) probability distributions. Those distributions involve several shape parameters that control the dynamic skewness, tail shape and peakedness of financial returns. We use daily return data from the Standard & Poor's 500 (S&P 500) index for the period of January 4, 1950 to December 30, 2017. We estimate all models by using the maximum likelihood (ML) method. We present new conditions for the asymptotic properties of the ML estimator, by extending the DCS literature. We study those conditions for the S&P 500 and we also perform diagnostic tests for the residuals. The statistical performances of several DCS specifications with dynamic shape are superior to the statistical performance of the DCS specification with constant shape. Outliers in the shape parameters are associated with important announcements that affected the United States (US) stock market. Our results motivate the application of the new DCS models to volatility measurement, pricing financial derivatives, or estimation of the value-at-risk (VaR) and expected shortfall (ES) metrics. - ... For example, there are many studies which underline the importance of identifying the different kinds of risk in organizations and developing suitable models for their assessment and management. Many researchers apply a research model focused on the economic functioning of market rules (Bangia et al., 2001; Jorion, 2007; Silva et al., 2018). This kind of approach could be inadequate for the evaluation of public organizations, where there are other dimensions to take into account (Adams et al., ...
- ... The reasoning behind such an approach is to view the quantile as a risk metric. For instance, one particularly interesting risk metric is Value-at-Risk (VaR), which has been in use for a few decades in the financial industry (Philippe, 2006). Artzner et al. (1999) ...Preprint
- May 2019

In distributional reinforcement learning (RL), the estimated distribution of the value function models both the parametric and intrinsic uncertainties. We propose a novel and efficient exploration method for deep RL that has two components. The first is a decaying schedule to suppress the intrinsic uncertainty. The second is an exploration bonus calculated from the upper quantiles of the learned distribution. In Atari 2600 games, our method outperforms QR-DQN in 12 out of 14 hard games (achieving a 483% average gain across 49 games in cumulative rewards over QR-DQN, with a big win in Venture). We also compared our algorithm with QR-DQN in a challenging 3D driving simulator (CARLA). Results show that our algorithm achieves near-optimal safety rewards twice as fast as QR-DQN. - ... PI. Optimization of the probability that the trajectory x(·) attains a certain set. This approach and its refinements yield what is called "percentile premium", "value-at-risk" (see [23] and [24, Chapter 3.5.3]) or "risk premium" (see [32, Example 6.5 and Chapter 17]). ...Preprint · Full-text available
- Apr 2019

This work presents a two-player extraction game where the random terminal times follow (different) heavy-tailed distributions which are not necessarily compactly supported. Besides, we delve on the implications of working with logarithmic utility/terminal payoff functions. To this end, we use standard actuarial results and notation, and state a connection between the so-called actuarial equivalence principle, and the feedback controllers found by means of the dynamic programming technique. Our conclusions include a conjecture on the form of the optimal premia for insuring the extraction tasks; and a comparison for the intensities of the extraction for each player under different phases of the lifetimes of their respective machineries. 2010 Mathematics Subject Classification: 91A10, 91A23, 49N90, 60E05 - ... The minmax formulation minimizes the damage of the worst outcome but it may not minimize the expected value of the right tail of the damage distribution. Instead, we use percentile-based metrics which offer better control of the expected tail value, such as value-at-risk (Studer, 1997;Jorion, 2006) and conditional tail expectation (CTE) or conditional valueat-risk, (CVaR). Percentile metrics have been widely used to assess extreme losses in finance (e.g. ...We develop an acceptance sampling approach for surveillance of the emerald ash borer (EAB), a harmful forest pest, in Winnipeg, Canada. We compare sampling strategies computed with two different management objectives. The first objective maximizes the expected area with detected infestations and the second objective minimizes the expected number of undetected infested trees in sites that were not inspected or where inspection did not find an infestation. 
The choice of the management objective influences the survey strategy: achieving the first objective involves selecting sites with high infestation rates proximal to the infested area, whereas the second objective requires inspecting sites with both high infestation rates and high host densities. Adding uncertainty prescribes inspecting a larger area with lower sampling rates and extending the surveys to farther distances from the infested locations. If a decision maker wants to minimize the worst-case damage from failed detections, the optimal strategy is to survey more sites with high host densities at farther distances, where EAB arrivals could cause significant damage if not detected quickly. Accounting for the uncertainty addresses possible variation in infestation rates and helps develop a more diversified survey strategy. The approach is generalizable and can support survey programmes for new pest incursions.
- ... Daily data are used to measure expected and abnormal returns. We use daily log returns because these are convenient for multiperiod returns (Campbell, Lo, and MacKinlay, 1996), a better median is derived when forecasting future cumulative returns (Hughson, Stutzer, and Yung, 2006), and the use of the logarithms of returns avoids negative security prices in security return models (Jorion, 2001). As stated above, a suitable event window to test for the effect of credit rating changes on security prices is around 11 days (Corwin and Lipson, 2000; Kryzanowski and Nemiroff, 2001; Choy et al., 2006; Brune and Liu, 2010), that is, from day T-3 to day T+10. ...Article · Full-text available
- Feb 2019

This study investigates whether a change in credit ratings leads to a change in daily excess stock returns. The sample includes daily stock price data for US firms listed on the Standard & Poor's 500 from January 2006 to December 2015. Firms' excess stock returns are compared with the market in a 14-day window around credit rating downgrades and upgrades. Our results are asymmetric; that is, there is a significant reaction to credit rating downgrades but not to upgrades. In addition, we report weak evidence of upgrades in credit ratings since the 2008 global credit crisis leading to significant changes in security prices. - Article
- Feb 2019
- OZONE-SCI ENG

The generalized extreme value distribution, generalized Pareto distribution, Pareto type-I distribution, Pareto type-II distribution, Burr distribution, log-logistic distribution, Fréchet distribution, log-normal distribution, log-Cauchy distribution, Lévy distribution, and Dagum distribution are fitted to four data-sets consisting of surface-level ozone for Weaverville, California (WVR); Tundra Lab, Niwot Ridge, Colorado (TUN); South Pole, Antarctica (SPO); and Mauna Loa, Hawaii (MLO) sites in the USA. Based on the probability distribution with the smallest Akaike information criterion, Bayesian information criterion, corrected Akaike information criterion, mean square error, mean absolute deviation, and maximum deviation values and probability-probability plots, the Dagum distribution emerged as the best fitting distribution for Weaverville, the Burr distribution emerged as the best fitting distribution for Niwot Ridge and Mauna Loa, and the log-Cauchy distribution emerged as the best fitting distribution for the South Pole site. The value at risk for each of the ozone sites is given and a backtest based on the Kupiec test is used to verify the appropriateness of the selected distributions. - Article
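The Kupiec backtest used above is a likelihood-ratio test comparing the observed number of VaR exceedances with the number implied by the stated coverage level. A minimal sketch (the function name and the sample figures are illustrative, not taken from the paper):

```python
import math

def kupiec_pof(n_obs, n_exceed, p):
    """Kupiec proportion-of-failures LR statistic for a VaR backtest.
    p is the expected exceedance probability (e.g. 0.01 for 99% VaR);
    under correct coverage the statistic is chi-squared with 1 df."""
    if n_exceed == 0:
        return -2 * n_obs * math.log(1 - p)
    pi = n_exceed / n_obs  # observed exceedance frequency
    return -2 * ((n_obs - n_exceed) * math.log(1 - p) + n_exceed * math.log(p)
                 - (n_obs - n_exceed) * math.log(1 - pi) - n_exceed * math.log(pi))

# 4 exceedances in 250 trading days against a 99% VaR
lr = kupiec_pof(250, 4, 0.01)
```

An LR value below the 5% chi-squared critical value of 3.84 means the exceedance count is consistent with the model's stated coverage.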
- Mar 2019
- ANN FOR SCI

Key message Economic consequences of altered survival probabilities under climate change should be considered for regeneration planning in Southeast Germany. Findings suggest that species compositions of mixed stands obtained from continuous optimization may buffer but not completely mitigate economic consequences. Mixed stands of Norway spruce (Picea abies L. Karst.) and European beech (Fagus sylvatica L.) (considering biophysical interactions between tree species) were found to be more robust against both perturbations in survival probabilities and economic input variables, compared to block mixtures (excluding biophysical interactions). Context Climate change is expected to increase natural hazards in European forests. Uncertainty in expected tree mortality and resulting potential economic consequences complicate regeneration decisions. Aims This study aims to analyze the economic consequences of altered survival probabilities for mixing Norway spruce (Picea abies L. Karst.) and European beech (Fagus sylvatica L.) under different climate change scenarios. We investigate whether management strategies such as species selection and type of mixture (mixed stands vs. block mixture) could mitigate adverse financial effects of climate change. Methods The bio-economic modelling approach combines a parametric survival model with modern portfolio theory. We estimate the economically optimal species mix under climate change, accounting for the biophysical and economic effects of tree mixtures. The approach is demonstrated using an example from Southeast Germany. Results The optimal tree species mixtures under simulated climate change effects could buffer but not completely mitigate undesirable economic consequences. Even under optimally mixed forest stands, the risk-adjusted economic value decreased by 28%. Mixed stands economically outperform block mixtures for all climate scenarios. Conclusion Our results underline the importance of mixed stands in mitigating the economic consequences of climate change. Mechanistic bio-economic models help to understand the consequences of uncertain input variables and to design purposeful adaptation strategies. - Article
- Jan 2019
- JASSS-J ARTIF SOC S

The financial system is inherently procyclical, as it amplifies the course of economic cycles, and one of the factors suggested to exacerbate this procyclicality is the Basel regulation on capital requirements. After the recent credit crisis, international regulators have turned their eyes to countercyclical regulation as a solution to avoid similar episodes in the future. Countercyclical regulation aims at preventing excessive risk taking during booms to reduce the impact of losses suffered during recessions, for example by increasing capital requirements during good times to improve the resilience of financial institutions at the downturn. The Basel Committee has already moved towards the adoption of countercyclical measures on a global scale: the Basel III Accord, published in December 2010, considerably revises the capital requirement rules to reduce their procyclicality. These new countercyclical measures will not be completely implemented until 2019, so their impact cannot be evaluated yet, and it is a crucial question whether they will be effective in reducing procyclicality and the appearance of crisis episodes such as the one experienced in 2007-08. For this reason, we present in this article an agent-based model aimed at analysing the effect of two countercyclical mechanisms introduced in Basel III: the countercyclical buffer and the stressed VaR. In particular, we focus on the impact of these mechanisms on the procyclicality induced by market risk requirements and, more specifically, by value-at-risk models, as this is an issue of crucial importance that has received scant attention in the modelling literature. The simulation results suggest that the adoption of both of these countercyclical measures improves market stability and reduces the emergence of crisis episodes. - ArticleFull-text available
- Jun 2018

In the literature, there is no consensus as to which Value-at-Risk forecasting model best measures market risk in banks. This study analyses the quality of Value-at-Risk forecasting models over periods of varying economic stability for major stock exchange indices. VaR forecasts from GARCH(1,1), GARCH-t(1,1), GARCH-st(1,1), QML-GARCH(1,1), CAViaR and historical simulation models were compared in periods with contrasting volatility trends (increasing, constantly high and decreasing) for economically developed countries (the USA – S&P 500, Germany – DAX and Japan – Nikkei 225) and economically developing countries (China – SSE COMP, Poland – WIG20 and Turkey – XU100). The data samples used in the analysis were selected from the period 01.01.1999 – 24.03.2017. To assess VaR forecast quality, the excess ratio, Basel traffic light test, coverage tests (Kupiec test, Christoffersen test), Dynamic Quantile test, cost functions and the Diebold-Mariano test were used. The results show that the quality of Value-at-Risk forecasts varies across models depending on the volatility trend. However, GARCH-st(1,1) and QML-GARCH(1,1) were found to be the most robust models across the different volatility periods. The results also show that the CAViaR model forecasts were less appropriate in the increasing volatility period. Moreover, no significant differences in VaR forecast quality were found between the developed and developing countries. - Conference Paper
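A one-step-ahead VaR forecast from a GARCH(1,1) model, as compared in the study above, recurses the conditional variance through the sample and scales a normal quantile by the forecast volatility. A minimal sketch, assuming the parameters omega, alpha and beta have already been estimated (e.g. by maximum likelihood); all figures are illustrative:

```python
import math

def garch11_var_forecast(returns, omega, alpha, beta, z=2.326):
    """One-step-ahead VaR from a GARCH(1,1) recursion under normal
    innovations; z is the quantile for the chosen confidence (2.326 ~ 99%).
    omega/alpha/beta are assumed to be pre-estimated parameters."""
    mean = sum(returns) / len(returns)
    # start the recursion at the sample variance
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    for r in returns:  # conditional-variance recursion
        var = omega + alpha * (r - mean) ** 2 + beta * var
    return z * math.sqrt(var) - mean  # VaR reported as a positive loss

returns = [0.001, -0.012, 0.004, -0.007, 0.009, -0.015, 0.002]
var99 = garch11_var_forecast(returns, omega=1e-6, alpha=0.08, beta=0.90)
```

In practice the parameters would come from a fitted model (e.g. the `arch` package); the hand-rolled recursion here only shows the forecasting step.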
- Dec 2018

- Article
- Jan 2019

This paper provides a general valuation model to fairly price a European option using parametric and non-parametric methods. In particular, we show how to use the historical simulation (HS) method, a well-known non-parametric statistical method applied in the financial area, to price an option. The advantage of the HS method is that one can directly obtain the distribution of stock returns from historical market data. Thus, it not only does a good job of capturing characteristics of the return distribution, such as clustering and fat tails, but it also eliminates the model errors created by mis-specifying the distribution of the underlying assets. To solve the problem of measure transformation in valuing options, we use the Esscher transform to convert the physical probability measure to the forward probability measure. Taiwanese put and call options are used to illustrate the application of this method. To clearly show which model prices stock options most accurately, we compare the pricing errors from the HS method with those from the Black–Scholes (BS) model. The results show that the HS model is more accurate than the BS model for both call and put options. More importantly, because there is no complex mathematical theory underlying the HS method, it can easily be applied in practice and help market participants manage complicated portfolios effectively. - ArticleFull-text available
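The core of the historical-simulation idea is to resample the empirical return distribution instead of assuming log-normality. The sketch below bootstraps historical daily log returns to price a European call; for simplicity it merely re-centres the returns to the risk-free drift, whereas the paper's method performs the measure change with the Esscher transform. Function names and figures are illustrative:

```python
import math
import random

def hs_call_price(s0, strike, r, days, log_returns, n_paths=20000, seed=7):
    """European call price by bootstrapping historical daily log returns.
    Simplification: returns are re-centred to the risk-free drift rather
    than re-weighted via the Esscher transform as in the paper."""
    rng = random.Random(seed)
    mu = sum(log_returns) / len(log_returns)
    daily_rf = r / 252
    payoffs = []
    for _ in range(n_paths):
        # terminal log price = sum of resampled, drift-adjusted daily returns
        path = sum(rng.choice(log_returns) - mu + daily_rf for _ in range(days))
        payoffs.append(max(s0 * math.exp(path) - strike, 0.0))
    return math.exp(-r * days / 252) * sum(payoffs) / n_paths

hist = [0.001, -0.012, 0.004, -0.007, 0.009, -0.015, 0.002, 0.006, -0.003, 0.011]
price = hs_call_price(100.0, 100.0, 0.02, 21, hist)  # 1-month at-the-money call
```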
- Jun 2015

Stock market portfolio management has drawn the attention of researchers from the fields of computer science, finance and mathematics around the world for years. Successfully managing a stock market portfolio is the prime concern of investors and fund managers in the financial markets. This paper aims to provide a walk-through of stock market portfolio management. It deals with questions such as: what is a stock market portfolio, how is it managed, what are the objectives behind managing it, and what are the challenges in managing it? As each coin has two sides, each portfolio has two elements – risk and return. In this regard, Markowitz’s Modern Portfolio Theory, or Risk-Return Model, for managing a portfolio is analyzed in detail along with its criticisms, the efficient frontier, and suggested state-of-the-art enhancements in terms of various constraints and risk measures. This paper also discusses other models for managing a stock market portfolio, such as the Capital Asset Pricing Model (CAPM) and the Arbitrage Pricing Theory (APT) Model. - Conference PaperFull-text available
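Markowitz's risk-return trade-off reduces, in the two-asset case, to a closed-form expression for the global minimum-variance portfolio. A small sketch of that special case (the numbers are illustrative):

```python
def min_variance_weights(var1, var2, cov):
    """Weights of the two-asset global minimum-variance portfolio from
    Markowitz mean-variance analysis: w1 = (var2 - cov) / (var1 + var2 - 2*cov)."""
    w1 = (var2 - cov) / (var1 + var2 - 2 * cov)
    return w1, 1.0 - w1

# asset 1: 20% volatility, asset 2: 30% volatility, covariance 0.006 (correlation 0.1)
w1, w2 = min_variance_weights(0.04, 0.09, 0.006)
```

With more than two assets (or with constraints, as the enhancements surveyed above require), the same objective is solved numerically with a quadratic-programming solver rather than in closed form.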
- Sep 2018

This study examines the potential economic gains from electricity generation using nuclear energy, a substitute energy source. The importance of the topic is underlined by the facts that roughly 11% of the world's electricity is produced from nuclear energy and that construction of nuclear reactors continues in developed countries. Alongside the risks of nuclear energy and the harms of nuclear waste, its economic gains are also significant. The contribution of the Akkuyu Nuclear Power Plant, whose construction has begun in Turkey, to Turkey's economic growth is examined using the examples of the USA, France and Japan, the three countries with the largest installed nuclear capacity in the world. The study investigates Granger causality between nuclear energy production and economic growth in these three countries. - Article
- Mar 2019

The frequent global financial crises in recent years show that it is necessary to implement macroprudential regulation for the banking system. At present, quantitative research on the macroprudential regulation for the dynamic Chinese banking network system is lacking, while the related studies in other countries have not considered the interbank network structure. Therefore, in the present paper, we construct a dynamic banking network model with a scale-free network and a dynamic macroprudential regulation model under four risk allocation mechanisms (CVaR, Incremental VaR, Shapley value EL, and ΔCoVaR) for the dynamic Chinese banking network system. Then, we conduct empirical research to study the effect of the macroprudential regulation model on the Chinese banking network system. Our results show that the Chinese banking network system was the most unstable in 2010 and that the average default probability decreased every year after the macroprudential regulation, indicating the effectiveness of the macroprudential regulation model. From the perspective of the scale-free network structure, we find that the intrinsic mechanism of macroprudential regulation is to rewire the interbank linkages from small banks to large banks with more interbank lending to prevent contagious risk, thereby improving the stability of the entire banking system. Moreover, the regulation effects of ΔCoVaR and CVaR mechanisms are found to be better than those of the other mechanisms. The regulation effect of ΔCoVaR is the most significant. - ArticleFull-text available
- Feb 2019

Companies implementing R&D projects face their unique features, among them the need for large capital investments, long-term implementation, high growth potential, a low probability of success, and difficulties in financing. Implementation of such projects is associated with high risks. This leads to underfunding, as uncertain results deter investors. The problem of assessing the risks arising from the implementation of such projects has not yet been sufficiently studied at the level of mathematical analysis models. The objective of the article is to develop a model for exploring the risks arising from implementing R&D projects. The author has developed a risk assessment model using the VaR measure modified for this application. Formulas have been obtained to calculate this measure. They have been reduced to simple analytical expressions assuming a balanced distribution of cash flow from the project, or a triangular distribution. The model considers the most important causes of risk in R&D projects. It can be used in a real-case scenario if a preliminary risk assessment of a project is done before its implementation and a decision is made on risk-based implementation. Moreover, this methodology can be used to standardize the decision-making process for R&D project implementation, considering the “risk appetite” using the VaR risk measure. - ArticleFull-text available
- Jan 2018

- Chapter
- Jun 2019

The Value-at-Risk (VaR) is probably the most known measure for quantifying and controlling the risk of a portfolio. The establishment of VaR is of central importance to a credit institute, since it is the basis for a regulatory notification technique and for required equity investments. - Chapter
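Under the parametric (delta-normal) approach described by Jorion, the VaR of a position is the normal quantile for the chosen confidence level times the volatility, scaled by the square root of the horizon. A minimal sketch (position size and volatility are illustrative):

```python
import math
from statistics import NormalDist

def delta_normal_var(position_value, sigma_daily, confidence=0.99, horizon_days=10):
    """Delta-normal VaR: position * daily vol * sqrt(horizon) * z-quantile."""
    z = NormalDist().inv_cdf(confidence)
    return position_value * sigma_daily * math.sqrt(horizon_days) * z

# $1m position with 1% daily volatility: 99%/10-day (Basel-style) VaR
var_10d = delta_normal_var(1_000_000, 0.01)
```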
- Jun 2019

Risk-sensitive decision-making with constraints of coherent risk measures is discussed. Risk-sensitive expected rewards under utility functions are approximated by weighted average value-at-risks, and risk constraints are described by coherent risk measures. In this paper, coherent risk measures are represented as weighted average value-at-risks with the best risk spectrum derived from the decision-maker’s risk-averse utility, and the risk spectrum can inherit the risk-averse property of the decision-maker’s utility as weighting. To find feasible regions, a dynamic risk-minimizing problem is first discussed by mathematical programming. Next, a risk-sensitive reward maximization problem under the feasible coherent risk constraints is demonstrated. A few numerical examples are given to illustrate the obtained results. - Chapter
- Jan 2020

In this chapter we discuss methods for updating and managing project contingency as the outcomes of work packages become known. More specifically, we provide bivariate and multivariate formulations with a number of examples to illustrate different situations in which the presented methods can be implemented. - Chapter
- May 2019

The aim of this paper is to investigate the relation between crowdfunding platforms and risk management in the Italian context, in order to assess the impact for participants. This study can be divided into two main parts. The first part analyses the main characteristics and types of crowdfunding and the current regulatory state for the Italian markets. The second part focuses specifically on risk management theory and presents some specific risks in equity crowdfunding. The findings of this research highlight potential threats and risks for investors and introduce some risk management approaches in crowdfunding platforms. To the authors’ knowledge, this is one of the first studies to examine in depth how crowdfunding risk can be assessed and how platforms can mitigate and manage related risks. - Article
- May 2019

The minimization of the probability of ruin is a crucial criterion for determining the effect of the form of reinsurance on the wealth of the cedant and is a very important factor in choosing optimal reinsurance. However, this optimization criterion alone does not generally lead to a rational decision when choosing an optimal reinsurance plan. This criterion acts only on the risk (minimizing it via the probability of ruin) but does not affect the technical benefit; that is to say, the insurer should not choose an optimal reinsurance treaty if it is not beneficial. We propose a new reinsurance optimization strategy that maximizes the technical benefit of an insurance company while maintaining a minimal level for the probability of ruin. The objective is to optimize with precision and computational ease using genetic algorithms. - Chapter
- Apr 2019

Risk is an integral part of our lives. When it comes to investments, risks are unavoidable. Gold, real estate and equity prices constantly move up and down. If you seek little or no risk, returns could be low. However, financial planners believe managing risk is essential, and one of the best products for managing it is mutual funds. - Chapter
- Apr 2019

After the Internet bubble burst, technology companies, especially Internet companies, were characterized as a sector of greater risk when compared to other consolidated sectors. Thus, the present study aims to analyze whether the market risk of companies in the Internet sector is still higher than that of companies in consolidated sectors. For this comparison, the Value-at-Risk (VaR) risk management method was used, which summarizes, in a single number, the worst expected return within a given confidence interval and time horizon. This methodology was applied to two groups: Internet companies traded on NASDAQ, and companies in consolidated sectors, such as consumer goods, manufacturing and financial services, among others, traded on the NYSE. Samples are divided between the 2000–2007 and 2008–2014 periods to compare behavior over time. The final result suggests that Internet companies still had a higher market risk than firms in consolidated sectors, but this risk decreased substantially between the periods studied. - Chapter
- Mar 2019

A dynamic portfolio optimization model with average value-at-risks is discussed for drastic declines of asset prices. Analytical solutions for the optimization at each time are obtained by mathematical programming. By dynamic programming, an optimality equation for optimal average value-at-risks over time is derived. The optimal portfolios and the corresponding average value-at-risks are given as solutions of the optimality equation. A numerical example is given to understand the solutions and the results. - ArticleFull-text available
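The average value-at-risk (also called expected shortfall) used in the optimization above is, at level alpha, the mean of the worst (1 − alpha) fraction of losses. An empirical sketch of that quantity (the sample data are illustrative):

```python
def average_var(losses, alpha=0.95):
    """Empirical average value-at-risk: mean of the worst (1 - alpha)
    fraction of the loss sample."""
    worst_first = sorted(losses, reverse=True)
    k = max(1, int(round(len(losses) * (1 - alpha))))
    return sum(worst_first[:k]) / k

losses = list(range(1, 101))      # losses 1..100
avar = average_var(losses, 0.95)  # mean of the 5 worst losses, 96..100
```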
- Jul 2016

This study aimed to empirically compare the risk between sharia and non-sharia based stock investment. The sharia stocks are defined as stocks issued by companies listed in the Jakarta Islamic Index (JII), whereas the non-sharia stocks are defined as stocks issued by companies listed in LQ-45 between 2011 and 2012. In total, there were 25 companies listed in LQ-45 and 15 companies listed in JII involved in this study. This study used a GARCH model to estimate the risk of every individual stock. The result showed that there was a difference in risk between sharia and non-sharia based stocks. This study also documented that non-sharia based stocks were riskier than sharia-based stocks. Finally, this study provides information on risk characteristics in the Indonesian capital market. - Chapter
- Apr 2009

The existence of heterogeneous investors and its incidence in gauging value at risk (VaR) have been previously dealt with in the literature in the context of the capital asset pricing model (CAPM), e.g., Fernandez (2005; 2006). In this chapter, in addition to the heterogeneity of investment horizons, we take account of spatial interrelations in financial markets by means of a spatial version of the CAPM (S-CAPM). This way, we can accommodate firms’ characteristics in terms of a metric distance, which allows us to quantify systematic and nonsystematic risks under financial linkages across firms. A measure of VaR is formulated from the S-CAPM, which also takes into consideration investors’ heterogeneity. We illustrate the use of our methodology by means of a panel of Latin American firms. In addition, we complement the discussion with Monte Carlo simulations aimed at quantifying the benefit of diversification in terms of VaR reduction. - ArticleFull-text available
- Mar 2019

Fund performance measurement studies have long been of great interest to academicians, fund managers and general investors. Assessment of the risk involved in investment has been at the center of all investment decisions. Treating risk as a whole and always taking it to be on the negative side of investments is not warranted. Since risk has two elements, upside risk and downside risk, the more important question in everyone's mind should be the maximum downside risk involved in investments. Considering this, Value-at-Risk (VaR) is a better tool for evaluating fund performance. In the present study, Value at Risk approaches have been used for the first time to analyze the performance of equity-based mutual funds and Unit Linked Insurance Plan funds. It can be concluded that mutual funds outperform the unit linked insurance plan funds.
