
Hi All,
We are trading equity options in markets which are mostly OTC (unlike USDEUR/JPY) and highly illiquid (e.g. Singapore). For a given ticker, we might get 2 or 3 volatility points per week (e.g. 6M ATM, 9M 110%, 1Y 90%). Despite this illiquidity, the bank has to come up with a full implied volatility surface on a weekly basis (short of being able to do that every day) so that it can mark its books and calculate risk sensitivities. The current process that generates this volatility surface is a little dodgy and makes too many assumptions in my opinion. I'm trying to implement a method that would generate a full surface while making the fewest assumptions possible. My traders have rough intuitions about the market, but that's not sufficient to generate a full vol surface. So far I have explored the following approaches/papers (see below for pros/cons). None have been totally convincing. Would you know what people have done to solve this issue? Any other approach you would recommend?
1) The Maximum Entropy Distribution of an Asset Inferred from Option Prices (Peter Buchen and Michael Kelly, 1996)
Pro: Simple methodology (relative entropy on the probability density function) that can be easily explained; easy to implement.
Cons: Each maturity is treated individually, so the surface might become weird since we only get quotes for certain expiries (if we get quotes for 6M and 1Y, the methodology will leave 9M unaffected).

2) Calibrating Volatility Surfaces via Relative-Entropy Minimization (Marco Avellaneda, 1999)
Pro: This method is also based on relative entropy. It "links" different maturities together by "fitting" a spot volatility surface.
Cons: This method is a bit difficult to explain to non-math people (stochastic control problem, etc.). The solution of this method (i.e. the spot volatility surface) is not something that is easily interpretable, unlike a Dupire local volatility surface. I have implemented it, but it is sometimes unstable (minimization of the solution of a PDE via L-BFGS).

3) Bayesian Entropic Inverse Theory Approach to Implied Option Pricing with Noisy Data
Pro: It seems to link various maturities together.
Cons: I have not been able to implement this successfully (I have omitted the noisy-data part and just focused on the conditional probability density); I don't know if something like this has ever been used in a production environment.
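For what it's worth, the per-maturity relative-entropy idea in (1) can be sketched on a discrete grid: minimise the KL divergence of the risk-neutral density from a prior, subject to repricing the few quotes you have. This is only an illustrative sketch; the grid, prior, forward, strikes and call prices below are all made up (and rates are set to zero).

```python
import numpy as np
from scipy.optimize import minimize

# Discrete risk-neutral density on a grid of terminal spot levels.
S = np.linspace(50.0, 150.0, 101)
prior = np.full(S.size, 1.0 / S.size)   # uninformative (uniform) prior

# Hypothetical market inputs for ONE maturity, zero rates for simplicity.
forward = 100.0
strikes = np.array([90.0, 100.0, 110.0])
call_prices = np.array([12.0, 6.0, 2.5])   # made-up, arbitrage-free quotes

# payoffs[i, j] = (S_j - K_i)^+
payoffs = np.maximum(S[None, :] - strikes[:, None], 0.0)

def kl(q):
    """Relative entropy of q with respect to the prior."""
    q = np.clip(q, 1e-12, None)
    return np.sum(q * np.log(q / prior))

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},          # density sums to 1
    {"type": "eq", "fun": lambda q: q @ S - forward},        # martingale / forward
    {"type": "eq", "fun": lambda q: payoffs @ q - call_prices},  # reprice quotes
]

res = minimize(kl, prior, method="SLSQP",
               bounds=[(0.0, 1.0)] * S.size, constraints=constraints)
q = res.x   # calibrated risk-neutral density for this expiry
```

As the original post notes, this treats each expiry in isolation; expiries without quotes are simply left at the prior, which is exactly the weakness described above.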
In a nutshell, I'm seeking fresh views on this, ideas, and returns on experience.
Thanks very much in advance. Cheers, Lamp'




I suppose what people would typically do is take a reasonably standard model (like Heston, SABR, SVI, etc.), fit the parameters "by hand", and then update them to liquid quotes, particularly the main ATM volatilities, as and when you see trades or good quotes in the market, taking account of your axe / the bid-ask spread. It is more an art than a science, though.
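A minimal sketch of the SVI route mentioned above: fit a raw SVI slice to a handful of quotes for one expiry with a least-squares solver. The quotes, initial guess and parameter bounds below are hypothetical, and with 5 parameters against 3 quotes the fit is underdetermined; in practice you would pin or bound parameters by hand, which is exactly the "art" part.

```python
import numpy as np
from scipy.optimize import least_squares

def svi_total_variance(k, a, b, rho, m, sigma):
    """Raw SVI parameterisation of total implied variance w(k) = iv(k)^2 * T."""
    return a + b * (rho * (k - m) + np.sqrt((k - m) ** 2 + sigma ** 2))

# Hypothetical sparse quotes for one expiry: log-moneyness and implied vols.
T = 0.5
k_mkt = np.array([-0.105, 0.0, 0.095])   # roughly 90%, ATM, 110% strikes
iv_mkt = np.array([0.26, 0.22, 0.21])
w_mkt = iv_mkt ** 2 * T

def residuals(p):
    a, b, rho, m, sigma = p
    return svi_total_variance(k_mkt, a, b, rho, m, sigma) - w_mkt

# Bounds keep |rho| < 1, b >= 0, sigma > 0 (illustrative choices).
res = least_squares(residuals, x0=[0.02, 0.1, -0.3, 0.0, 0.1],
                    bounds=([-1.0, 0.0, -0.999, -1.0, 1e-4],
                            [1.0, 1.0, 0.999, 1.0, 1.0]))
a, b, rho, m, sigma = res.x
```

Note this sketch does nothing about calendar-arbitrage between expiries; interpolating the fitted parameters across maturities (and checking total variance is increasing in T) would still be needed for a full surface.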


mmport80


Total Posts: 85 
Joined: Jul 2010 


How does (1) compare with the current process?
When going out on a limb, it's better to make sure the new procedure is simple enough for everyone to understand.
If (1) is superior to the current process (i.e. the current one has similar inconsistencies), (1) is probably best.

http://johnorford.blogspot.com
http://blog.johnorford.com 


granchio


Total Posts: 1535 
Joined: Apr 2004 


Funny, that: I was at a bank in Singapore talking about this just a couple of weeks before your post. Feel free to PM me, though of course I won't be able to solve your problem by email!
EDIT: it appears you are looking at papers that are at least 12 years old?
"Deserve got nothing to do with it"  Clint 


Good day, granchio
Could I send you an email with some questions (regarding different subjects)?
Regards, 
One of my most productive days was throwing away 1000 lines of code. 


granchio


Total Posts: 1535 
Joined: Apr 2004 


Sure 
"Deserve got nothing to do with it"  Clint 


