Interview

Bruno Dupire: «The problem of finance is not to compute...»

A pioneer of local volatility models and former head of quantitative research at Société Générale, then at Paribas, Bruno Dupire gives us his vision of the markets and responds to criticism of local volatility.

Article also available in : English EN | français FR

You are the author of the famous "Dupire" model, or local volatility model, extensively used in the front office. In what context did you publish this model, and what were your motivations at the time?

When I joined Société Générale in early 1991, my first concern was to develop extensions of the Black-Scholes model that reflect the volatility term structure: first in a conventional manner, with an instantaneous volatility that is a deterministic function of time (the Merton model, 1973), then with a stochastic volatility model, which resulted in 1992 in a paper called "Arbitrage Pricing with Stochastic Volatility", with a simplified version, "Model Art", in Risk Magazine in September 1993. This paper showed how to build a logarithmic profile from vanilla (European) options and delta-hedge it to replicate the realized variance, allowing in particular the instantaneous forward variance to be synthesized, and therefore traded. It is then possible, in the spirit of Heath-Jarrow-Morton, to make assumptions about its dynamics and to "risk-neutralize" them, to obtain a stochastic volatility model calibrated to the term structure of implied volatilities. This paper introduced, without my knowing it, variance swaps (as Neuberger did) and volatility derivatives.
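The log-contract mechanism described here can be sketched numerically. The following is my own minimal illustration (not from the interview), assuming zero rates and a flat 20% volatility: delta-hedging a static position in the payoff -2·ln(S_T/S_0) with 2/S_t shares recovers the realized variance path by path, which is why the profile synthesizes variance.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, sigma, T, n_steps, n_paths = 100.0, 0.2, 1.0, 252, 20000
dt = T / n_steps

# Simulate geometric Brownian motion paths (zero rates for simplicity)
z = rng.standard_normal((n_paths, n_steps))
log_ret = -0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * z
S = S0 * np.exp(np.cumsum(log_ret, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

# Realized variance of each path
realized_var = (np.diff(np.log(S), axis=1) ** 2).sum(axis=1)

# Static log contract plus a dynamic delta hedge of 2/S_t shares:
# P&L = -2*ln(S_T/S0) + sum of 2/S_t * (S_{t+1}-S_t),
# which matches the realized variance path by path (up to O(dt^{3/2}) terms)
log_payoff = -2.0 * np.log(S[:, -1] / S0)
hedge_pnl = (2.0 / S[:, :-1] * np.diff(S, axis=1)).sum(axis=1)
replication = log_payoff + hedge_pnl

print(realized_var.mean(), replication.mean())  # both close to sigma**2 * T = 0.04
```

The per-path agreement (not just on average) is the point: the hedge converts the log profile into realized variance in every scenario.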

In 1992, traders were increasingly interested in another market distortion relative to Black-Scholes: the skew, the strong dependence of the implied volatility on the strike, which leads to different assumptions about the price dynamics depending on the option considered, which is untenable. I therefore tried to build a single model compatible with all vanilla option prices, with a first, discrete approach in a binomial tree. The idea was to find transition probabilities that would match the market prices.

The principle is very simple: consider a Call whose strike and maturity coincide with a node of the tree. At the previous time step, its value at each node defines a profile that can be written as a portfolio of three Calls with neighboring strikes expiring immediately. By matching the market prices of the initial Call and of the portfolio, we obtain the transition probabilities and the discrete local variance, which converges to the local variance as the number of time steps increases.
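The portfolio of three neighboring Calls is a butterfly, and its price per unit of squared strike spacing recovers the transition probability. A minimal sketch of this idea in continuous strikes (my own illustration, using flat-volatility Black-Scholes prices with zero rates): the butterfly quotient approximates the risk-neutral density.

```python
from math import log, sqrt, exp, erf, pi

def ncdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, sigma):
    # Black-Scholes call price with zero rates and dividends
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * ncdf(d1) - K * ncdf(d2)

S0, sigma, T, K, dK = 100.0, 0.2, 1.0, 105.0, 0.5

# Butterfly: long one call at K-dK and one at K+dK, short two at K.
# Its price divided by dK^2 approximates the risk-neutral density at K.
butterfly = (bs_call(S0, K - dK, T, sigma)
             - 2 * bs_call(S0, K, T, sigma)
             + bs_call(S0, K + dK, T, sigma))
density = butterfly / dK**2

# Exact lognormal density of S_T at K for comparison
d2 = (log(S0 / K) - 0.5 * sigma**2 * T) / (sigma * sqrt(T))
exact = exp(-0.5 * d2 * d2) / (K * sigma * sqrt(2 * pi * T))
print(density, exact)  # the two values agree closely
```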

To ensure the relevance of the approach, I needed a formulation of the model in continuous time, which I obtained in early 1993. The model has the following characteristics (and is the only one to have them all):
- The instantaneous volatility is a deterministic function of time and price (local volatility);
- The model reproduces exactly the price of vanilla options;
- This local volatility can be calculated explicitly from option prices;
- The model is complete and therefore yields unique prices and a replication strategy.
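The third characteristic, the explicit formula, is what is now known as Dupire's formula; with zero rates it reads sigma_loc^2(K,T) = (dC/dT) / (0.5 * K^2 * d2C/dK2). A minimal finite-difference sketch (my own illustration, not from the interview): applied to a flat Black-Scholes surface, it returns the flat volatility, as it should.

```python
from math import log, sqrt, erf

def ncdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, sigma):
    # Black-Scholes call price with zero rates and dividends
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    return S * ncdf(d1) - K * ncdf(d1 - sigma * sqrt(T))

S0, sigma = 100.0, 0.2
K, T, dK, dT = 110.0, 1.0, 0.25, 1e-3

# Dupire's formula (zero rates), with derivatives taken by finite differences:
# sigma_loc^2(K,T) = (dC/dT) / (0.5 * K^2 * d2C/dK2)
dC_dT = (bs_call(S0, K, T + dT, sigma) - bs_call(S0, K, T - dT, sigma)) / (2 * dT)
d2C_dK2 = (bs_call(S0, K - dK, T, sigma)
           - 2 * bs_call(S0, K, T, sigma)
           + bs_call(S0, K + dK, T, sigma)) / dK**2

local_vol = sqrt(dC_dT / (0.5 * K**2 * d2C_dK2))
print(local_vol)  # close to 0.2: a flat surface gives back a flat local volatility
```

With a skewed input surface the same quotient would return a strike- and maturity-dependent local volatility, which is the point of the formula.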

My paper "Pricing and Hedging with Smiles" was presented in June 1993, with a version, "Pricing with a Smile", published in Risk Magazine in January 1994. Meanwhile, Emanuel Derman and Iraj Kani, of the research group at Goldman Sachs, had developed a binomial tree that answered the same question (they finally switched to a trinomial tree in 1996, but it is in any case better to implement a finite difference method). Mark Rubinstein, at Berkeley, had a binomial tree that could not calibrate to several maturities.

We have all been associated with this model. On the one hand I found that a bit unfair, because I had built a better tree earlier and, more importantly, I developed the continuous-time theory and set up the robust hedging approach for volatility (the superbucket), which breaks down the Vega (sensitivity to volatility) across strikes and maturities. But I was at the time a relatively unknown quant, and I was honored to be among celebrities in the field.

So I had two models: the first with a stochastic volatility calibrated to the volatility term structure (but not to the skew), the second with a deterministic volatility calibrated to the whole surface. It was therefore natural to try to unify the two, to obtain a stochastic volatility model calibrated to the surface.

In 1995 I presented "A Unified Theory of Volatility", which establishes, among other things, that the local variances (the squares of the local volatilities) are synthesizable from the vanillas, and that a stochastic volatility model is calibrated to the surface if and only if the expected instantaneous variance, conditional on a price level, equals the local variance given by the surface.
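In symbols, with v_t the instantaneous variance of the spot S_t and sigma_loc the local volatility extracted from the vanillas, the calibration condition just stated can be written as:

```latex
\mathbb{E}\!\left[\, v_t \,\middle|\, S_t = K \,\right] \;=\; \sigma_{\mathrm{loc}}^2(K, t)
```

A stochastic volatility model matches the whole vanilla surface if and only if this holds for every strike K and every date t.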

How did the market react at the time?

Mixed at first. The issues facing traders regarding the smile were about knowing whether the skew was justified or excessive, while my concern was not to question it, but rather to understand its impact on the price of exotic options. Gradually, the market understood the importance of calibrating a model to standard instruments in order to derive the price of more complicated instruments, and also to facilitate the aggregation of risk. This idea was accepted earlier in the world of interest rates than in the world of volatility. It is now fully assimilated, and several banks have thousands of PCs working to revalue and analyze the risk of huge option portfolios within the local volatility model.

In a recent interview on this site, Elie Ayache stated: "The local volatility model has blocked any attempt to calibrate to anything other than plain vanilla options, which meant that neither the recalibration nor the marketization of the corresponding risk factor (the two essential virtues of implied volatilities) could be taken to the next level. The distinction between the smile problem and the problem of its dynamics is only due to an accident of history that now gives the impression that we are discovering, with the smile dynamics, a new and exciting issue, while it is the same old problem from the beginning: the smile or, more simply, the pricing of derivative products outside the Black-Scholes framework. This accident of history is the local volatility model (1994)." In the same vein, Patrick Hagan, formulating the SABR (Stochastic Alpha Beta Rho) model, explained that the hedge produced by the local volatility model is less accurate than that of the Black-Scholes model. What is your opinion of these criticisms?

It is important to distinguish the concept of local volatility from the local volatility model.

- Local volatilities reveal information about the future behavior of volatility from today's vanilla option prices, regardless of the model considered. They are also the tool that allows one to exploit differences between forward values and one's own views, converting them into trading strategies.

- The local volatility model postulates that the instantaneous volatility follows exactly the local volatility extracted from option prices, and is thus equal to a deterministic function of time and of the underlying price.

This is obviously a very strong, indeed untenable, assumption, just like the constant volatility of the Black-Scholes model. At least the local volatility model is the minimal model compatible with today's prices, which is a small step in the right direction, even though it offers only a very poor description of possible future developments.

I developed stochastic volatility models and alternative approaches to modeling both before and after the local volatility model, precisely because its limitations are so glaring. However, local volatilities (or more precisely their squares, the local variances) themselves play a central role, because they are quantities that can be locked in from existing options, through arbitrage positions on the strike dimension against the maturity dimension.

Many participants are unaware that local variances have the status of instantaneous forward variances conditional on a price level. To criticize local volatility is to criticize the instantaneous forward rate, which was a major advance in the modeling of interest rates. The concept of volatility being more elusive than that of interest rates, and options having been created after bonds, it is natural that the concept of forward volatility (forward variance, actually) appeared well after that of forward rates.
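The analogy can be made explicit (notation mine, not the interview's). Just as the instantaneous forward rate is the maturity derivative of the cumulative yield, the unconditional instantaneous forward variance is the maturity derivative of the cumulative variance, e.g. of the variance swap level:

```latex
f(T) \;=\; \frac{\partial}{\partial T}\bigl[\,T\,y(T)\,\bigr],
\qquad
v(T) \;=\; \frac{\partial}{\partial T}\bigl[\,T\,\sigma_{\mathrm{VS}}^2(T)\,\bigr]
```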

In retrospect, I think my real contribution is not so much having developed local volatility as having defined the notion of instantaneous forward variance, conditional or unconditional, and explained the mechanisms by which to synthesize it.

To return to the question, it is a mistake to think that the local volatility approach separates the static problem (calibration today) from the dynamic problem (evolution of the volatility surface). Calibration to today's prices yields local volatilities that are conditional expectations of future instantaneous variances. If the market does not follow these "predictions", so much the better: there is a statistical arbitrage to implement.

As for SABR and hedging: even in the presence of stochastic volatility (but in the absence of jumps), the optimal hedge, in the sense of minimizing the variance of the P&L, of short-term near-the-money options depends only on the current short-term skew. The optimal hedge ratio is a total derivative with respect to the underlying price, as opposed to the sensitivity, which is only a partial derivative; it contains a cross term that depends on the component of the stochastic volatility correlated with the underlying, a component itself dictated by the short-term skew.
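Schematically (my notation, not the interview's), the minimum-variance hedge ratio of an option value V is the total derivative

```latex
\Delta_{\mathrm{opt}}
\;=\; \frac{dV}{dS}
\;=\; \frac{\partial V}{\partial S}
\;+\; \frac{\partial V}{\partial \sigma}\,
      \frac{\operatorname{Cov}(d\sigma,\,dS)}{\operatorname{Var}(dS)}
```

The cross term involves only the component of the volatility move correlated with the spot move, which is precisely what the short-term skew pins down.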

The article on SABR said in essence two things:

- 1) Local volatility does not predict the correct behavior of future volatility and thus produces bad hedges.
- 2) To remedy the situation, SABR introduces a correlated stochastic volatility.

On the first point, it is an empirical question, much debated and on which opinions are divided; but, again, the purpose of local volatility is not to predict the future but to establish the forward values that can be guaranteed. So if the market systematically deviates from the local volatilities, it is possible to set up an arbitrage strategy.

On the second point, unfortunately for SABR, the average behavior (the volatility being stochastic, we can only speak of it in terms of expectation) is the same as... that of the local volatility model! This is again due to the fundamental fact that today's calibration data pin down the conditional expectation of the instantaneous variance, which is none other than the local variance. In SABR, two parameters affect the skew: the beta exponent and the correlation.

Unfortunately, on the one hand they are largely redundant and, on the other hand, the error is to compute the change in volatility associated with a move of the underlying while holding the other parameters fixed, which contradicts the presence of correlation. When the correlation is taken into account, one realizes that SABR is a noisy version of the local volatility model, centered on it.

In summary, the local volatility model has its limitations, but the concept of local volatility itself is not at fault, and disregarding it is to condemn oneself to not understanding the mechanisms underlying volatility. For example, if the local volatility over a future period and for an interval of strikes is below 10%, and you are convinced that at that date the volatility will be above 10% if the underlying price is in that interval, it is possible to exploit the strike and maturity dimensions to build a position that accurately reflects this view.

You keep working on volatility and correlation. Can these two parameters be considered as assets in their own right?

It is fashionable to regard them as "asset classes" and to speak freely of volatility or correlation trading and arbitrage, in most cases unjustifiably. To do this properly, it is fundamental to "purify" the strategies so that they reflect these quantities without being affected by other factors. For example, a simple (vanilla) option is actually a complicated mix of exposures to the volatility and to the underlying, among others. In particular, the exposure to volatility is highest around the strike and negligible far out of the money.

Mastering volatility requires being able to build positions fully exposed to it, either unconditionally (level trades) or purely conditionally (trading the skew, among others). A very common situation is to have a correct anticipation but end up with a loss, because the position is not consistent with the view: for example, playing the volatility convergence of the CAC and the DAX by trading one at-the-money straddle against another can lead to a loss if one of the indices drifts away and the volatilities do converge, but at a distance from the strikes. The same principle applies to dispersion arbitrage, for example. To accurately translate a view on correlation into a strategy, one must ideally operate with a full range of strikes, or with variance swaps.

The quantities that can be traded synthetically are not volatility and correlation but, to some extent, variance and covariance. Specifically, if all vanillas on a given underlying are liquid, it is possible to lock in the levels of instantaneous variances (the squares of short-term at-the-money volatilities), unconditional or conditional, but not the skews.
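The unconditional case can be illustrated with the standard static replication of a variance swap from a strip of out-of-the-money vanillas. This is my own sketch (zero rates, flat 20% smile, strike spacing and truncation chosen for illustration); with a flat smile the strip returns exactly the square of the volatility.

```python
from math import log, sqrt, erf

def ncdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_price(S, K, T, sigma, call=True):
    # Black-Scholes price with zero rates; put obtained via put-call parity
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    c = S * ncdf(d1) - K * ncdf(d2)
    return c if call else c - S + K

S0, sigma, T = 100.0, 0.2, 1.0

# Variance swap strike from a strip of out-of-the-money vanillas:
# K_var = (2/T) * integral over K of Q(K)/K^2 dK, where Q is the OTM price
# (puts below the forward, calls above; here forward = spot since rates are zero)
dK, total, K = 0.05, 0.0, 1.0
while K < 1000.0:
    q = bs_price(S0, K, T, sigma, call=(K >= S0))
    total += q / K**2 * dK
    K += dK
var_strike = 2.0 / T * total
print(var_strike)  # close to sigma**2 = 0.04 under a flat smile
```

Under a skewed smile the same strip would lock in a variance level different from the at-the-money volatility squared, which is exactly the sense in which variance, not volatility, is the tradable quantity.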

For the multi-asset case, the situation is more complicated. Assuming that basket or spread options (with various coefficients) are available, it is possible to lock in the unconditional instantaneous covariance, but not the conditional one, and only the absolute (normal) covariance, not the proportional (lognormal) one. Correlation, being a non-linear combination of variances and covariances, can only be treated approximately.
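For instance, with basket and spread options on two assets A and B, one can lock in the absolute covariance by polarization (notation mine):

```latex
\operatorname{Cov}(A,B)
\;=\; \tfrac{1}{4}\Bigl[\operatorname{Var}(A+B) \;-\; \operatorname{Var}(A-B)\Bigr]
```

Each variance on the right can be synthesized from the corresponding basket or spread option strip, which is why the covariance, unlike the correlation, is attainable.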

What do you retain from these last 20 years in the financial industry, and what comes next?

I think they were the golden age of quantitative finance, with a great variety of problems, products and models. The first of these two decades was the pioneering era; since then the field has become more industrial, and regulatory constraints require more documentation to justify the models. The field has matured, and once-innovative methods have become standard subjects taught at university. In the 80s, very few quants had academic training in stochastic calculus; it is now the norm.

Quantitative finance has been overwhelmed by an influx of mathematicians who have imposed their methods, sometimes to the detriment of the relevance of the problems. Emphasis is placed on computational techniques, with the choice of a model determined by the existence of closed-form formulas.

This shift from the conceptual to the computational can be observed, for example, in the treatment of hedging. The mathematician is interested primarily in the price, calculated as an expectation over the scenarios generated by the model, while the trader requires not just an average but a result guaranteed regardless of the realized scenario. It is the hedge that converts a potential profit into a profit guaranteed in every scenario, but this is often neglected by quants in favor of pricing.

In my opinion, the problem of finance is not to compute geodesics in the Poincaré half-plane to obtain asymptotic expansions in a stochastic volatility model, but rather to estimate, for example, the effectiveness of a hedge, knowing that most models are not complete and that reality surely is not.

Regarding the future, it is likely that work on market microstructure, powered by the dominance of electronic trading, will continue to grow. It may receive a contribution from behavioral finance, to better model the price formation process and the dynamics of trend following and rebound. I also think that credit modeling will change, giving less importance to reduced-form models, which describe bankruptcy as a sudden event, whereas it is generally preceded by a strong upward drift in spreads!

On the business side, we can expect an expansion of securitization to a wide variety of underlyings (for a French example: le viager), thanks to two powerful concepts: pooling, to diversify risk, and tranching, to fit the payoff to the views or preferences of different classes of investors. More generally, I think that techniques of optimal risk sharing will develop, leading to products better suited to actual needs and stemming the recent tendency of banks to offer products that create risks for both counterparties.

F.Y., August 2008



© Next Finance 2006 - 2019 - All rights reserved