I am trying to perform the numerical integration in the Heston model using Gaussian quadrature, but I obtain an error of 4e-3 while some of the deep out-of-the-money near-expiry call prices are smaller than 1e-5. Is there any way I can improve the accuracy without being extremely slow, or should I exclude those prices from my calibration?
-
Pretty hard to say without further info. Which Fourier pricing method are you using, for instance: 2 integrals as in the original Heston 93 paper, or 1 integral à la Attari, Joshi-Yang or Lewis-Lipton? Real-life option prices below the cent... I'm a bit surprised, what is the market data you are using? – Quantuple Aug 09 '17 at 13:28
-
I've tried the original method with Gatheral's CF as well as the Lewis 2000 method. I also tried Attari's. I used the data set I found in the journal paper "Approximation Behooves Calibration" by Andre Ribeiro and Rolf Poulsen, which makes the dataset available at http://www.math.ku.dk/~rolf/Svend/ in the files data_1 and data_2. – user28757 Aug 09 '17 at 15:20
-
Thanks. I see that the input data are IVs, not option prices. Thing is, real option prices are quantised (min tick size), so you'll never observe listed prices on the order of 1e-5. That being said, it is important for your pricing method to be accurate (i.e. if you are to price a deep OTM option, you won't just quote 0!). Now, in my experience, the best you can do in terms of accuracy for Heston is to use the Lord-Kahl optimal alpha inversion + Carr-Madan formulation. However, this may not be the best choice computation-wise. Indeed, with methods like Attari and the likes (...) – Quantuple Aug 09 '17 at 15:41
-
(...) you can benefit from 'caching' the CF evaluations. Anyway, what kind of Gaussian quadrature are you using exactly? Can't you simply decrease the tolerance? Also, working in a normalised spot space can decrease numerical problems (e.g. work with spot=1, strike=$K/S_0$ etc. and multiply the end result by $S_0$ to get the price). Sometimes using a BS control variate can also help. Have a look at this: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2362968 – Quantuple Aug 09 '17 at 15:45
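For concreteness, the Carr-Madan formulation mentioned in the comments above can be sketched as follows. This is a minimal illustration under the Black-Scholes characteristic function with a fixed damping parameter alpha (not the Lord-Kahl optimal alpha); the function names and parameter values are illustrative assumptions, not taken from the thread:

```python
import numpy as np
from scipy.integrate import quad

def bs_cf(u, S0, r, sigma, T):
    """Risk-neutral characteristic function of ln(S_T) under Black-Scholes."""
    m = np.log(S0) + (r - 0.5 * sigma**2) * T
    return np.exp(1j * u * m - 0.5 * sigma**2 * T * u**2)

def carr_madan_call(S0, K, r, sigma, T, alpha=1.5):
    """Carr-Madan damped call price with a fixed alpha (illustrative choice)."""
    k = np.log(K)
    def integrand(v):
        # Fourier transform of the alpha-damped call price
        u = v - (alpha + 1) * 1j
        psi = np.exp(-r * T) * bs_cf(u, S0, r, sigma, T) \
              / (alpha**2 + alpha - v**2 + 1j * (2 * alpha + 1) * v)
        return (np.exp(-1j * v * k) * psi).real
    # the integrand decays like a Gaussian, so the infinite limit is harmless
    integral, _ = quad(integrand, 0.0, np.inf, limit=200)
    return np.exp(-alpha * k) / np.pi * integral
```

The damping factor e^{alpha k} is what makes the transform integrable; in the Heston case one would only swap `bs_cf` for the Heston CF, and a poorly chosen alpha is exactly where the Lord-Kahl optimisation comes in.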
-
Thank you for the prompt answer and suggestions. Yes, that's exactly what I was trying to do, use the CF caching method. I tried the Gauss-Laguerre and Gauss-Legendre quadratures implemented by Fabrice D. Rouah in Matlab. I also tried Matlab's adaptive Lobatto and adaptive Gauss-Kronrod; they work a bit better. Another thing is that I am using an equal-weight MSE loss function (with prices) for calibration, which also significantly affects the model accuracy. ATM and ITM contracts are estimated by the model with an error of less than 1%, but the very deep OTM price is sometimes 400% larger. (..) – user28757 Aug 09 '17 at 16:22
-
(...) I also tried using IMPV, but it's too slow and looks less accurate for the ATM and ITM contracts. My model works fine for synthetic data, but I got stuck when I tried to use the market data. Do people really use this model to price options which are so deep OTM? Also, I was thinking of using the parameters estimated today to forecast the price for tomorrow; do you think that makes sense? Again, thanks a lot for your advice. – user28757 Aug 09 '17 at 16:26
-
I see you express your errors in %, but % of what is not clear to me. I usually do that on IVs and it's not that slow. No, to price vanillas people don't do that especially when time to expiry is small: Heston is too rigid overall and known to have a hard time fitting both the short end and the long end of the vanilla market. That being said it's weird that your calib fails when using real data. Are you sure the problem is with Heston and not the risk-neutral drift (hence discount and forward curve)? Also yes today's parameters should be a good initial guess for tomorrow's fit. – Quantuple Aug 09 '17 at 17:04
-
To make it easier, I uploaded a table which contains the prices estimated with my model after calibration divided by the real market prices (https://drive.google.com/file/d/0B_8ahEY-UgoOUUMwV0pkd01fZTQ/view?usp=sharing). Does it look like the model works? The deep OTM near expiry is really bad, though that may be because of the integration error and/or the equal-weight loss function I use. Regarding the risk-neutral drift, I took the data for the zero rate and yield from the same source, so I hope it is good enough. – user28757 Aug 09 '17 at 17:36
-
One more question regarding the risk-neutral drift: from the dataset presented above, I use the spot price as the price of the underlying asset and I include the zero rate and dividend yield in the risk-neutral drift. Is this correct? Should I use options on forward contracts and get rid of the dividend yield? – user28757 Aug 09 '17 at 19:30
-
Risk-neutral drift: yes, this should be fine. However, if you have access to the option prices, check that the forward implied through call-put parity is indeed the same as the one calculated from the spot and funding/dividend yields. Calibration: it does not look abnormal to me. Indeed, errors could come from the aspects you've mentioned, but again, Heston is notorious for having a hard time fitting the whole surface. This is why people often add jumps. You could try to keep only the first expiries to see if your model can fit them "on their own", if that bothers you. – Quantuple Aug 10 '17 at 07:27
-
Cheers. I kept only the first 3 expiry dates and I got these results: https://drive.google.com/file/d/0B_8ahEY-UgoOOHl3SnZkeU1GdmM/view?usp=sharing. (I realized that I have negative prices because I didn't include the Feller condition.) I also include the market prices obtained from the IVs I got in the data set. They are extremely small for deep OTM near expiry, which makes me wonder if I made a mistake somewhere or if the dataset may not be accurate. The thing is, I found it in a decent journal paper, so I think it's ok. I am doing this for my dissertation and it's now too late to get different data. – user28757 Aug 10 '17 at 11:32
-
Hmmm, negative prices suggest you definitely have an accuracy issue. What pricing method are you using again? You've stated many but not your final candidate. In my experience the set of Heston parameters that fits the market surface best usually violates the Feller condition. I can't tell if you made a mistake when inferring prices since I do not have the discount/forward curve, but they look to me like synthetic prices (in the sense that the author probably somehow fitted the market vol smile at $t$ and you're using that as input). I think you can still work with these. – Quantuple Aug 10 '17 at 11:45
-
These results are calculated using the Bakshi and Madan (2000) method, which modifies the original Heston approach so that the probability-related terms P1 and P2 are calculated from a single characteristic function. I also use the Albrecher et al. (2007) formulation of the characteristic function (I think Gatheral (2006) also used it). I use Gauss-Laguerre quadrature (32 points), with caching. The negative prices are for the contracts with original market prices 1.92E-07 and 1.49E-4, which may be because my model is not that accurate or because I didn't impose the Feller condition in order to get a better fit. – user28757 Aug 10 '17 at 12:23
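The setup described in this comment can be sketched as follows: the "little Heston trap" CF of Albrecher et al. (2007), the single-CF Bakshi-Madan probabilities P1 and P2, 32-point Gauss-Laguerre quadrature, and caching of the CF evaluations so they are reused across strikes. Function names and parameter values are illustrative assumptions, not the poster's actual code:

```python
import numpy as np

def heston_cf(u, S0, r, q, T, kappa, theta, sigma, rho, v0):
    """'Little Heston trap' (Albrecher et al. 2007) CF of ln(S_T)."""
    x = np.log(S0)
    d = np.sqrt((rho * sigma * 1j * u - kappa)**2 + sigma**2 * (1j * u + u**2))
    g = (kappa - rho * sigma * 1j * u - d) / (kappa - rho * sigma * 1j * u + d)
    C = (kappa * theta / sigma**2) * (
        (kappa - rho * sigma * 1j * u - d) * T
        - 2.0 * np.log((1.0 - g * np.exp(-d * T)) / (1.0 - g)))
    D = ((kappa - rho * sigma * 1j * u - d) / sigma**2
         * (1.0 - np.exp(-d * T)) / (1.0 - g * np.exp(-d * T)))
    return np.exp(1j * u * (x + (r - q) * T) + C + D * v0)

def heston_calls_cached(S0, strikes, r, q, T, params, n=32):
    """Bakshi-Madan pricing: evaluate the CF at the quadrature nodes once,
    then reuse the cached values for every strike."""
    u, w = np.polynomial.laguerre.laggauss(n)  # nodes/weights for ∫ e^{-x} f(x) dx
    w = w * np.exp(u)                          # undo the e^{-x} weight
    phi2 = heston_cf(u, S0, r, q, T, *params)             # cached, used for P2
    phi1 = heston_cf(u - 1j, S0, r, q, T, *params) \
           / (S0 * np.exp((r - q) * T))                   # cached, used for P1
    prices = []
    for K in strikes:
        e = np.exp(-1j * u * np.log(K))       # only this factor changes per strike
        P1 = 0.5 + np.sum(w * (e * phi1 / (1j * u)).real) / np.pi
        P2 = 0.5 + np.sum(w * (e * phi2 / (1j * u)).real) / np.pi
        prices.append(S0 * np.exp(-q * T) * P1 - K * np.exp(-r * T) * P2)
    return np.array(prices)
```

Here `params = (kappa, theta, sigma, rho, v0)`. A quick sanity check: with a tiny vol-of-vol and v0 = theta, the Heston price should come out very close to Black-Scholes with vol sqrt(v0).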
-
The author says: "The third aim is to put a good data-set of option prices in the public domain; daily observations of implied volatility surfaces for the S&P500-index over the period 2005-2009 synchronized with the index itself and with estimated complete term structures of interest rates and dividend yields...the data have been provided for use in research by a major investment bank (it is the volatility surfaces that the bank itself uses)" . – user28757 Aug 10 '17 at 12:26
-
Yes, all of this makes sense. Well, as I've said, I see nothing abnormal in your first results, but it's hard to say more without doing the exercise myself :) As far as accuracy is concerned, what happens when you increase the number of quadrature points? Anyway, good luck with the rest. – Quantuple Aug 10 '17 at 12:34
-
Using 32 and 100 quadrature points improves it a bit, but not significantly. One last question, if I may: I am thinking of using only 9 strikes from K/S = 1.20 to 0.8, so that I will not have to deal with those extremely small numbers. Do you think 9 strikes x 14 expiries would be enough for calibration? Thank you for the discussion, it helped me a lot. – user28757 Aug 10 '17 at 15:28
-
I usually do it with 25 points linearly spaced between -0.5 and 0.5 (using an SVI parameterisation of the market smiles). But there's no "true" way of doing this. The key is to keep the pitfalls of the model in mind to avoid using it inadequately. There's no point in only using it where it's good just for the sake of it. Also, with Heston one of the most difficult problems is having "stable" parameters from one day to the next. This is because very different sets of parameters can give rise to almost exactly the same vanilla surface (but different forward vol dynamics). – Quantuple Aug 10 '17 at 16:22
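The sampling scheme described in this comment might look like the following minimal sketch, using the "raw" SVI parameterisation of total implied variance. The parameter values are purely illustrative, not market-fitted:

```python
import numpy as np

def svi_total_variance(k, a, b, rho, m, s):
    """Raw SVI parameterisation: w(k) = a + b*(rho*(k-m) + sqrt((k-m)^2 + s^2)),
    where w(k) = sigma_imp(k)^2 * T and k = ln(K/F) is log-moneyness."""
    return a + b * (rho * (k - m) + np.sqrt((k - m)**2 + s**2))

# illustrative (not market-fitted) SVI parameters for a single expiry T
T = 1.0
params = dict(a=0.02, b=0.1, rho=-0.4, m=0.0, s=0.2)

# 25 calibration points linearly spaced in log-moneyness, as in the comment
k = np.linspace(-0.5, 0.5, 25)
w = svi_total_variance(k, **params)   # total variances from the fitted smile
iv = np.sqrt(w / T)                   # implied vols used as calibration targets
```

These 25 implied vols per expiry then replace the raw quote grid as the calibration target, which sidesteps the illiquid deep-OTM quotes entirely.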
-
I see what you mean. It makes sense to keep the whole surface and discuss the problems I can't fix. Thank you! – user28757 Aug 10 '17 at 17:32
-
Yes, that's right, and if you do some reading on the topic you'll see that it's a well-known problem. See also SVJ models (the inclusion of jumps into Heston, for instance) to better fit the whole surface. – Quantuple Aug 10 '17 at 17:42
2 Answers
Use Fourier-cosine expansions, first proposed by Fang and Oosterlee in 2008.
The method is very simple to code, and its exponential convergence gives working accuracy in sub-second times, an order of magnitude faster than Carr-Madan. For reference-level precision you can add another order of magnitude of terms.
My VBA implementation is faster and more stable than Carr-Madan in C++ for OTM options.
The long list of comments suggests two different issues:
- Are you measuring your error against market prices? In other words, are you trying to calibrate Heston parameters to market prices? If yes, it is well known that Heston is not going to match well, which is not too surprising since it has only 5 free parameters.
- If you measure against some reference Heston model prices (given in the literature), what kind of Gauss quadrature are you referring to? There are many different ones. Is it adaptive? How many points are used?
Regarding (2), @James suggestion is good. The COS method is very simple to implement and produces satisfactory results most of the time. What is not always so trivial is finding a good estimate of the truncation interval (the suggestions from the COS paper are a reasonably good starting point). This mostly impacts very deep out-of-the-money options.
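To illustrate the COS method and its cumulant-based truncation interval, here is a minimal sketch under the Black-Scholes characteristic function, where the result can be checked against the closed form; the Heston case only swaps in a different CF and its cumulants. Function names and the choice L = 10 are illustrative:

```python
import numpy as np

def cos_call_bs(S0, K, r, sigma, T, N=128, L=10.0):
    """COS method (Fang & Oosterlee 2008) for a European call under
    Black-Scholes, with truncation [a, b] = c1 -/+ L*sqrt(c2)."""
    c1 = np.log(S0 / K) + (r - 0.5 * sigma**2) * T  # 1st cumulant of ln(S_T/K)
    c2 = sigma**2 * T                               # 2nd cumulant
    a, b = c1 - L * np.sqrt(c2), c1 + L * np.sqrt(c2)
    u = np.arange(N) * np.pi / (b - a)
    phi = np.exp(1j * u * c1 - 0.5 * c2 * u**2)     # CF of ln(S_T/K)
    # cosine coefficients of the call payoff K*(e^x - 1)^+ over [0, b]
    chi = (np.cos(u * (b - a)) * np.exp(b) - np.cos(u * a)
           + u * np.sin(u * (b - a)) * np.exp(b)
           + u * np.sin(u * a)) / (1.0 + u**2)
    psi = np.empty(N)
    psi[0] = b
    psi[1:] = (np.sin(u[1:] * (b - a)) + np.sin(u[1:] * a)) / u[1:]
    V = 2.0 / (b - a) * K * (chi - psi)
    terms = (phi * np.exp(-1j * u * a)).real * V
    terms[0] *= 0.5                                 # first term is halved
    return np.exp(-r * T) * terms.sum()
```

For deep OTM strikes the truncation interval matters because the payoff region [0, b] must still sit comfortably inside [a, b]; widening L trades a little convergence speed for robustness.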
A recent fast quadrature has been proposed in "An adaptive Filon quadrature for stochastic volatility models", along with comparisons against various other quadrature methods.