
# Financial Mathematics - Science topic

Mathematical Modelling

Questions related to Financial Mathematics

I intend to focus my research on Financial Mathematics/Actuarial Science. I am open to collaborating and learning new things, and I am looking for someone experienced in the area to mentor me. If you can help, please contact me at aminu.nass@fud.edu.ng

I am working on portfolio optimization using the lower partial moment of order 1. Can someone help me implement LPM-1 in an Excel sheet, using a threshold ("tau") of 0.00% and order (n) of 1?

Thank you all in advance for your contributions to my question.
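For reference, a minimal sketch of what LPM-1 computes, assuming returns are in a list; the comment shows one possible Excel equivalent (cell range `R` and threshold cell `tau` are hypothetical names):

```python
def lpm(returns, tau=0.0, n=1):
    """Lower partial moment of order n with threshold tau:
    LPM_n = (1/T) * sum over t of max(tau - r_t, 0) ** n.
    For tau = 0 and n = 1 this is the average shortfall below zero."""
    # one Excel equivalent: =SUMPRODUCT((tau - R) * (R < tau)) / COUNT(R)
    T = len(returns)
    return sum(max(tau - r, 0.0) ** n for r in returns) / T
```

In a portfolio optimizer, this LPM-1 value replaces the variance as the risk term to be minimized.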

How can the model-stability problem be addressed? Are these tests necessary to diagnose model stability in the case of an ARDL model?

I am required to submit an article to a good journal.

EMM used to be popular, but there have not been many papers on it lately. To get published now, you may need more advanced methods such as MCMC. But does that mean EMM has proven inadequate, or just that it is no longer worth writing new papers about?

I am eager to study stochastic processes and their applications in finance. As I am a student of economics, the concepts are completely unfamiliar to me. Any help would be appreciated. Can anyone suggest an introductory textbook?

I’m using 720 daily returns for each stock (144 stocks) to calculate CARs and BHARs on a monthly basis. For each stock, at any period (36 months, each month with 20 trading days), the answers from the two measures differ.

Example: for stock AA, the CAR and BHAR in the first trading month are not the same.

But over the whole sample, the final answers in each month are almost the same under both methods. Is that possible, or did I make a mistake in the calculation steps?

- CAR - first, the monthly average return of each of the 144 stocks separately; then the sum of those averages in each month divided by the number of stocks; finally, the cumulative return by adding up the monthly sample returns.
- BHAR - simply followed the formula on a daily basis and took the values at each 20-day mark.

The formulas I used are attached. Please help me solve this problem.

Thanks in advance.
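The per-month divergence with near-agreement in aggregate is expected: CAR sums period returns while BHAR compounds them, and the two differ by cross-product terms that are small when individual returns are small. A minimal sketch:

```python
def car(returns):
    # cumulative abnormal return: simple sum of period returns
    return sum(returns)

def bhar(returns):
    # buy-and-hold abnormal return: compounded return minus 1
    prod = 1.0
    for r in returns:
        prod *= (1.0 + r)
    return prod - 1.0
```

For example, returns of `[0.01, -0.02, 0.015]` give CAR = 0.005 but BHAR ≈ 0.004647: close but not identical, which matches what you observe.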

I am new to actuarial science and trying to figure things out myself. After computing reserve estimates using three methods (chain ladder, link ratio, and the cadence method), how do we decide which one to keep and which to eliminate?

Thank you so much !
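Method choice is usually based on back-testing and residual diagnostics rather than a fixed rule. For reference, here is a minimal chain-ladder sketch on a hypothetical cumulative run-off triangle (all numbers are made up for illustration):

```python
def chain_ladder(triangle):
    """Basic chain ladder on a cumulative run-off triangle.
    triangle[i] lists cumulative claims for accident year i
    (more recent accident years have fewer development entries)."""
    n = len(triangle)
    # volume-weighted development factors f_j = sum C[i][j+1] / sum C[i][j]
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    # project each accident year to ultimate; reserve = ultimate - latest paid
    reserves = []
    for row in triangle:
        ult = row[-1]
        for j in range(len(row) - 1, n - 1):
            ult *= factors[j]
        reserves.append(ult - row[-1])
    return factors, reserves
```

Comparing each method's projected ultimates against later actual developments (out-of-sample) is one common way to pick between them.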

I am writing a paper on bank stability in Africa. The Z-score is used as the dependent variable. However, given the large variation in the Z-score, I am tempted to log-transform the variable to reduce scale bias.

I would like to know whether it is valid to take the log of the dependent variable (Z-score) and, if so, what the implications are.

I'm modelling deviations of the interbank rate from the policy rate (Yt) as a function of a liquidity variable (Xt). The idea is that the interbank rate deviates more depending on the level (threshold) of Xt. Which econometric models can I use? If you have related literature, I would be grateful.

Thanks
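One standard option for this setup is a threshold (TAR-type) regression, with the threshold chosen by grid search over observed Xt values so as to minimize the total sum of squared residuals across the two regimes. A minimal sketch (function names and the trimming rule are illustrative):

```python
def ols_ssr(xs, ys):
    """Sum of squared residuals of a simple regression of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx > 0 else 0.0
    a = my - b * mx
    return sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))

def threshold_fit(x, y, trim=5):
    """Grid-search the threshold c over observed x values, keeping at
    least `trim` observations per regime; returns the SSR-minimizing c."""
    best_c, best_ssr = None, float("inf")
    for c in sorted(x)[trim:-trim]:
        lo = [(xi, yi) for xi, yi in zip(x, y) if xi <= c]
        hi = [(xi, yi) for xi, yi in zip(x, y) if xi > c]
        if len(lo) < trim or len(hi) < trim:
            continue
        ssr = ols_ssr(*zip(*lo)) + ols_ssr(*zip(*hi))
        if ssr < best_ssr:
            best_c, best_ssr = c, ssr
    return best_c
```

Hansen's threshold-regression papers are the usual starting point for inference on the estimated threshold.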

I'm trying to derive something and would like to check my result.

I was wondering the following:

Suppose X_t is MA(1) with no intercept, MA parameter θ, and innovation variance σ². Is there an interesting relation for the difference X_{t+n} − X_t? I showed on scrap paper that the difference process must be MA(1) with no intercept, but does the MA parameter change to θ^n and the variance to nσ²? Thanks for any help or confirmation.

Which books would you recommend if you are *not an expert in data mining* (or similar) techniques? Especially, I am interested in the topic: time value of money.

I am looking for a data set to test a local-volatility pricing algorithm for spread or crack options on commodities (WTI). I would need:

- Futures market quotes

- Options (call/put) market quotes at the spread-option maturity

- Ideally: spread or crack market quotes

- Ideally: corresponding Kirk / Bjerksund or Monte Carlo reference prices.

To your knowledge, is there a standard data set already used to test the performance and precision of such methods?

Thanks for any contribution !
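If Kirk reference prices are missing from a data set, Kirk's approximation is short enough to generate them yourself. A sketch (parameter names are illustrative: F1 and F2 are the two futures, K the spread strike, rho the correlation):

```python
import math

def norm_cdf(x):
    # standard normal CDF via the complementary error function
    return 0.5 * math.erfc(-x / math.sqrt(2))

def kirk_spread_call(F1, F2, K, sigma1, sigma2, rho, T, r):
    """Kirk's approximation for a European call on the spread F1 - F2
    with strike K: treat F1 / (F2 + K) as lognormal with an effective
    volatility and apply the Black-76 formula."""
    w = F2 / (F2 + K)
    sig = math.sqrt(sigma1 ** 2 - 2.0 * rho * sigma1 * sigma2 * w + (sigma2 * w) ** 2)
    S = F1 / (F2 + K)
    d1 = (math.log(S) + 0.5 * sig ** 2 * T) / (sig * math.sqrt(T))
    d2 = d1 - sig * math.sqrt(T)
    return math.exp(-r * T) * (F2 + K) * (S * norm_cdf(d1) - norm_cdf(d2))
```

A useful sanity check: with sigma2 = 0 the formula collapses to a plain Black-76 call on F1 with strike F2 + K.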

My research is on the relationship between stock-return volatility and trading volume, so I included both trading volume and forecasted volatility in the mean equation of a GARCH(1,1) model; however, estimation keeps failing. How do I solve this problem?
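Estimation failures in GARCH(1,1) often come from the scale of the returns (rescaling to percent returns frequently helps) or from alpha + beta drifting toward one. For debugging, it can help to run the variance recursion directly with trial parameters (the numbers below are purely illustrative):

```python
def garch11_variances(returns, omega, alpha, beta):
    """Conditional-variance recursion of a GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}.
    Initialized at the unconditional variance omega / (1 - alpha - beta)."""
    s2 = omega / (1.0 - alpha - beta)
    out = [s2]
    for r in returns[:-1]:
        s2 = omega + alpha * r * r + beta * s2
        out.append(s2)
    return out
```

If these variances explode or collapse for the parameter region your optimizer visits, that usually points to the scaling or stationarity problem rather than to the volume regressor in the mean equation.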

I would like someone to discuss the following hypothesis:

The Black-Scholes formula is not a valid option-pricing model.

When backtesting S&P stock options using Black-Scholes with the realized volatility (= standard deviation) ex post, the costs exceed the payoffs by 4 percent; using the VIX (= volatility of the S&P 500), by 26 percent.

The method is: buy fictitious at-the-money call options day by day over 15 years at their fair-value price, then compare the summed costs with the cumulated payoffs. The rationale: the payoffs should at least roughly match the accumulated procurement costs.

(For puts it's even worse - 18 / 46 percent overpricing.)

Any comment appreciated.
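For reference, the pricing and accumulation steps of a backtest like the one described above can be sketched as follows (a minimal sketch with illustrative parameters; zero rates and a fixed volatility are assumed, not taken from the original study):

```python
import math

def norm_cdf(x):
    return 0.5 * math.erfc(-x / math.sqrt(2))

def bs_call(S, K, sigma, T, r=0.0):
    """Standard Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def backtest(prices, sigma, horizon_days, dt=1.0 / 252.0):
    """Buy an ATM call every day, hold to expiry; return total premium
    paid vs. total payoff received."""
    cost = payoff = 0.0
    T = horizon_days * dt
    for t in range(len(prices) - horizon_days):
        S = prices[t]
        cost += bs_call(S, S, sigma, T)                    # ATM: K = S
        payoff += max(prices[t + horizon_days] - S, 0.0)   # expiry payoff
    return cost, payoff
```

The 4% / 26% figures quoted above then correspond to comparing the accumulated `cost` and `payoff` over the sample.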

Xavier Vives defined fully revealing and partially revealing rational expectations equilibria on pages 80-81 of his book, *Information and Learning in Markets: The Impact of Market Microstructure*. I understood the literary definitions, but how can we prove mathematically whether a REE is fully or partially revealing?

I am trying to find possible correlations between classical investments such as equities, commodities, and forex and alternative investments such as art indices. Could you suggest an advanced statistical or econometric methodology?

Hello

Different sources state different ways to select the lag order before running a VAR or VECM in Stata, though the Stata manual notes the following:

"To test for cointegration or fit cointegrating VECMs, we must specify how many lags to include. Building on the work of Tsay (1984) and Paulsen (1984), Nielsen (2001) has shown that the methods implemented in varsoc can be used to determine the lag order for a VAR model with I(1) variables. The order of the corresponding VECM is always one less than the VAR. vec makes this adjustment automatically, so we will always refer to the order of the underlying VAR."

Should I always choose first-differenced variables for the VAR and level variables for the VECM?

Thanks

Hello

Attached is the daily Kazakhstan Stock Exchange index from January 2007 to January 2015. Monthly data are not available, only daily.

So, do you know an easy way (maybe using macros) to transform it into a monthly index?
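Without macros, the usual convention is simply to keep the last daily observation of each calendar month. A minimal sketch:

```python
from datetime import date

def daily_to_monthly(observations):
    """observations: list of (date, value) pairs sorted by date.
    Returns the last available observation in each calendar month."""
    monthly = {}
    for d, v in observations:
        monthly[(d.year, d.month)] = (d, v)  # later dates overwrite earlier ones
    return [monthly[k] for k in sorted(monthly)]
```

If you can use pandas, `df.resample("M").last()` on a date-indexed DataFrame does the same in one line.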

Starting from the Black-Scholes equation, how can I obtain the fractional Black-Scholes equation and the generalized fractional Black-Scholes equation?
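One common route in the literature (a sketch, not the only generalization) replaces the first-order time derivative in the classical equation with a Caputo fractional derivative of order 0 < α ≤ 1, recovering the classical equation at α = 1:

```latex
% Classical Black-Scholes PDE
\frac{\partial V}{\partial t}
  + \frac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S} - rV = 0

% Time-fractional Black-Scholes (Caputo derivative of order 0 < \alpha \le 1)
\frac{\partial^{\alpha} V}{\partial t^{\alpha}}
  + \frac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S} - rV = 0
```

Further "generalized" versions also fractionalize the spatial (S) derivatives; the choice of fractional derivative (Caputo vs. Riemann-Liouville) matters for the initial/terminal conditions.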

There are various methods to measure stochastic volatility, from basic models to GARCH. Which books or resources giving a good applied overview would you recommend?

In practice, I am looking for resources that cover not only volatility measurement but also the assumptions required to use particular models.

The MCQ is a tool used to elicit individual intertemporal discount rates *k* by providing a set of alternative choices between lower, more immediate amounts of money and higher, delayed amounts. An estimate of a respondent's discount rate can be calculated as the geometric mean of the *k* values at indifference between the two questions at which the respondent switches from choosing the delayed reward to the immediate reward. However, respondents are sometimes inconsistent, i.e. they provide multiple switching points. How can you estimate k in those cases? Many thanks.

Agent-based artificial financial markets are used as an alternative to real financial markets. How far is this statement correct?

I am interested in all methods relevant to fPDEs (fractional partial differential equations).

Details relevant to the question are in attachment. Thank you for your answers.

In the Stata help for the LSDVC model, it is explained that the estimation results are saved by xtlsdvc in e():

**Scalars**

e(N) number of observations

e(sigma) estimates of σ from the first-stage regression

e(Tbar) average number of time periods

e(N g) number of groups

**Macros**

e(cmd) xtlsdvc

e(depvar) name of dependent variable

**Matrices**

e(b) xtlsdvc estimates

e(V) var–cov matrix of the xtlsdvc estimator

e(b lsdv) xtreg,fe estimates

e(V lsdv) var–cov matrix of the xtreg,fe estimator

**Functions**

e(sample) marks estimation sample

but I do not know the command that displays these results.

Say you want to optimize buy/sell controls for a portfolio with transaction costs over 180 periods.

Theoretically, you use spectral decomposition, search for viscosity solutions, discretize, etc., then solve backwards... but you run into the curse of dimensionality, or it yields unstable solutions that are too difficult to interpret anyhow. Some adaptive online sampling methods seem to work sufficiently well (Q-learning, TD learning, NDP, SMC, etc.). Has anyone used them? Thanks
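The Q-learning option in the list above is easy to prototype. Here is a toy tabular sketch for a one-asset buy/sell problem with a proportional transaction cost; the price dynamics and all parameters are made up purely for illustration:

```python
import random

def toy_qlearning(episodes=20000, cost=0.05, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning for a toy trading problem.
    State: (price level in {0,1,2}, current position in {0,1});
    action: target position in {0,1}; price mean-reverts toward level 1;
    reward: position * price change - cost * |trade size|."""
    rng = random.Random(seed)
    Q = {(p, h, a): 0.0 for p in range(3) for h in (0, 1) for a in (0, 1)}
    price, hold = 1, 0
    for _ in range(episodes):
        # epsilon-greedy action choice
        if rng.random() < eps:
            act = rng.choice((0, 1))
        else:
            act = max((0, 1), key=lambda a: Q[(price, hold, a)])
        # mean-reverting price step: more likely to rise when low
        up = {0: 0.7, 1: 0.5, 2: 0.3}[price]
        nxt = min(2, price + 1) if rng.random() < up else max(0, price - 1)
        reward = act * (nxt - price) - cost * abs(act - hold)
        best_next = max(Q[(nxt, act, a)] for a in (0, 1))
        key = (price, hold, act)
        Q[key] += alpha * (reward + gamma * best_next - Q[key])
        price, hold = nxt, act
    return Q
```

In realistic dimensions the table is replaced by a function approximator, which is exactly the NDP / approximate-dynamic-programming route you mention.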

I want to find a formula for calculating the NPV of a string of past values in a situation where the interest rate changes annually, rather than assuming a constant rate.
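Under one common convention (a sketch, assuming cash flows at period ends, oldest first, with `rates[i]` the rate during period i), each past flow is simply compounded forward at the rates of the periods that follow it:

```python
def accumulated_value(cashflows, rates):
    """Value today of past periodic cash flows under period-specific rates.
    cashflows[i] is received at the end of period i (oldest first) and then
    accrues interest at rates[i+1], ..., rates[-1] until today."""
    total = 0.0
    for i, cf in enumerate(cashflows):
        v = cf
        for r in rates[i + 1:]:
            v *= (1.0 + r)
        total += v
    return total
```

For future (rather than past) flows, the same idea runs in reverse: divide each flow by the running product of (1 + r) up to its date.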

1. In determining the number of lags to use for vecrank (trace test, maximum-eigenvalue test), I use the information criteria. However, when estimating the VAR and checking residuals, the residuals are not well behaved. I therefore ignore the information criteria and keep increasing lags until the residuals are well behaved in the corrgram, but I have to go up to 15 (and I am using monthly data). Should I just use 10 lags, or whatever the information criteria recommend, even though the residuals are not well behaved?

2. If, in pre-testing the different series, some of the series have a significant trend component (i.e. they are trend stationary), do I need to add a trend (rtrend) in vecrank? (Usually I use trend(rconstant).)

I would prefer a method that gives a closed-form solution of a Volterra integral equation of the second kind with a nonlinear kernel. But if no such method exists, any method will help.
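If a closed form is out of reach, successive approximation (Picard iteration) is one standard numerical fallback for the second-kind equation u(x) = f(x) + integral of K(x, t, u(t)) dt from 0 to x, and it handles nonlinear kernels directly. A sketch with a trapezoidal-rule discretization (f, K, and all grid parameters are illustrative):

```python
def volterra_picard(f, K, x_max, n=101, iters=40):
    """Picard iteration for u(x) = f(x) + integral_0^x K(x, t, u(t)) dt,
    on an n-point uniform grid with trapezoidal quadrature."""
    h = x_max / (n - 1)
    xs = [i * h for i in range(n)]
    u = [f(x) for x in xs]          # start from u_0 = f
    for _ in range(iters):
        new = []
        for i, x in enumerate(xs):
            # trapezoidal integral of K(x, t, u(t)) over [0, x]
            integ = 0.0
            for j in range(i):
                a = K(x, xs[j], u[j])
                b = K(x, xs[j + 1], u[j + 1])
                integ += 0.5 * h * (a + b)
            new.append(f(x) + integ)
        u = new
    return xs, u
```

As a check, with f(x) = 1 and K(x, t, u) = u the exact solution is u(x) = e^x, and the iteration converges to it on the grid.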

In Islamic finance the interest rate must be zero; what should the portfolio model be in this case?

Dear all,

I need your help on the conversion of quarterly data to weekly via cubic interpolation.

I have weekly market data (stock prices) and quarterly balance-sheet data (i.e. total assets). The paper that I read states: "The daily market data is converted to a weekly frequency and matched with interpolated values of the quarterly balance sheet data. With that it is possible to generate weekly time series of growth rates of market-valued total assets..."

I will really appreciate your help.

Best regards.

Kostas
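For the interpolation step described above, a natural cubic spline fitted to the quarterly observations can be evaluated at the weekly dates. A minimal pure-Python sketch (the standard tridiagonal natural-spline algorithm; knot positions below are illustrative):

```python
def natural_cubic_spline(xs, ys):
    """Natural cubic spline through (xs, ys); returns an evaluator function.
    Second derivatives are solved from the usual tridiagonal system."""
    n = len(xs) - 1
    h = [xs[i + 1] - xs[i] for i in range(n)]
    alpha = [0.0] * (n + 1)
    for i in range(1, n):
        alpha[i] = 3.0 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    # forward sweep (Thomas algorithm) with natural boundary conditions
    l, mu, z = [1.0] + [0.0] * n, [0.0] * (n + 1), [0.0] * (n + 1)
    for i in range(1, n):
        l[i] = 2.0 * (xs[i + 1] - xs[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]
    # back substitution for the polynomial coefficients
    c = [0.0] * (n + 1)
    b, d = [0.0] * n, [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = z[i] - mu[i] * c[i + 1]
        b[i] = (ys[i + 1] - ys[i]) / h[i] - h[i] * (c[i + 1] + 2.0 * c[i]) / 3.0
        d[i] = (c[i + 1] - c[i]) / (3.0 * h[i])
    def s(x):
        i = next((j for j in range(n) if x <= xs[j + 1]), n - 1)
        dx = x - xs[i]
        return ys[i] + b[i] * dx + c[i] * dx * dx + d[i] * dx ** 3
    return s
```

In practice `scipy.interpolate.CubicSpline` does the same job; evaluate the fitted spline at each weekly date expressed on the same time axis as the quarterly knots.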

Using the historical variance-covariance matrix as an input to the optimizer leads to estimation errors. What other methods can be used to estimate the variance-covariance matrix, apart from shrinkage and diagonal methods?

Especially if this is a multi-factor model.
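Beyond shrinkage and diagonal targets, common alternatives include factor-model covariances (the factor loadings plus a diagonal idiosyncratic term) and exponentially weighted estimators. A minimal RiskMetrics-style EWMA sketch (lam = 0.94 is the classic daily decay; returns are assumed demeaned):

```python
def ewma_cov(returns, lam=0.94):
    """EWMA covariance recursion S_t = lam * S_{t-1} + (1 - lam) * r_t r_t',
    where `returns` is a list of per-period return vectors.
    Initialized from the outer product of the first observation."""
    k = len(returns[0])
    S = [[returns[0][i] * returns[0][j] for j in range(k)] for i in range(k)]
    for r in returns[1:]:
        for i in range(k):
            for j in range(k):
                S[i][j] = lam * S[i][j] + (1.0 - lam) * r[i] * r[j]
    return S
```

For a multi-factor model, the same recursion can be run on the (much lower-dimensional) factor returns, with the asset covariance rebuilt from the loadings.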

Searching for the present value of a linearly changing interest-rate series.

In the paper by Hwa Kil Kim published on June 14, 2012, what is the meaning of R^D, and of "J: R^D → R^D is a matrix satisfying (Jv) ⊥ v for all v in R^D"? The paper is titled *Moreau-Yosida approximation and convergence of Hamiltonian systems on Wasserstein space*, and it is on RG.

I am talking about the parameter that makes the no-arbitrage condition consistent with the model (as used and described by Hibbert, Mowbray & Turnbull, 2001, and Ahlgrim, D'Arcy & Gorvett, 2004). In a one-factor Vasicek model, some papers refer to it as "lambda" and incorporate it in the equation for the price of a zero-coupon bond of maturity T. For a two-factor Vasicek model, i.e. one long-term factor and one short-term factor, I cannot find any source with an equivalent parameter, and the calculation seems extremely complicated...

Several books say that both call and put options are priced higher when the volatility is higher.

I can understand it this way: the stock price may fall when volatility is high, which makes the put option more expensive; yet I cannot see why high volatility also makes the call option more expensive.

Of course one may say that volatility means risk, and higher risk commands a higher price, but this reasoning seems too naive.

We are at the basic-options stage and cannot yet apply the Black-Scholes pricing formula; I just want a cheap and clear classroom answer, if possible.
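The textbook claim can also be checked numerically without any pricing formula: just average the payoff over simulated paths. A Monte Carlo sketch under geometric Brownian motion (illustrative parameters, zero rates assumed):

```python
import math
import random

def mc_call(S0, K, sigma, T, n_paths=100_000, seed=42):
    """Monte Carlo price of a European call under GBM with zero rates:
    the average of max(S_T - K, 0) over simulated terminal prices."""
    rng = random.Random(seed)
    drift = -0.5 * sigma * sigma * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        ST = S0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(ST - K, 0.0)
    return total / n_paths

low = mc_call(100.0, 100.0, 0.1, 1.0)    # low volatility
high = mc_call(100.0, 100.0, 0.3, 1.0)   # high volatility
```

The higher-volatility price comes out clearly larger, because the call payoff is asymmetric: extra dispersion adds to the unbounded upside while the downside stays capped at zero. The same asymmetry works for puts.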