18:00
0DTEs: Trading, Gamma Risk and Volatility Propagation
Registration is free but required. Register Here.
Abstract
Investors fear that surging volumes in short-term, especially same-day expiry (0DTE), options can destabilize markets by propagating large price jumps. Contrary to the intuition that 0DTE sellers predominantly generate delta-hedging flows that aggravate market moves, high open-interest gamma in 0DTEs does not propagate past volatility. 0DTEs and the underlying markets have become more integrated over time, leading to a marginally stronger link between index volatility and 0DTE trading. Nonetheless, intraday shocks to 0DTE trading volume do not amplify recent past index returns, which is inconsistent with the view that the growth of the 0DTE market intensifies market fragility.
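As a concrete illustration of the gamma mechanism the abstract refers to, here is a minimal Black-Scholes sketch (all parameter values are illustrative assumptions, not figures from the paper) comparing the gamma of an at-the-money same-day option with that of a one-month option; a short dealer's delta-hedging flow per unit index move is proportional to this gamma, which is why 0DTEs are feared to amplify moves.

```python
import math

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bs_gamma(spot, strike, vol, t_years, r=0.0):
    """Black-Scholes gamma (identical for calls and puts)."""
    d1 = ((math.log(spot / strike) + (r + 0.5 * vol ** 2) * t_years)
          / (vol * math.sqrt(t_years)))
    return norm_pdf(d1) / (spot * vol * math.sqrt(t_years))

spot = strike = 5000.0          # hypothetical at-the-money index option
vol = 0.15                      # hypothetical implied volatility
for label, t in [("0DTE (~6.5 trading hours left)", 6.5 / (24 * 365)),
                 ("1-month option", 30 / 365)]:
    g = bs_gamma(spot, strike, vol, t)
    # A dealer who is short this option must trade roughly gamma units of
    # the index per 1-point move to stay delta-neutral.
    print(f"{label}: gamma = {g:.5f}")
```

With these assumed inputs the 0DTE gamma comes out roughly ten times the one-month gamma, which makes concrete why same-day expiries concentrate so much hedging pressure near the money.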
About the speaker
Grigory Vilkov, Professor of Finance at the Frankfurt School of Finance and Management, holds an MBA from the University of Rochester and a Ph.D. from INSEAD, with further qualifications from Goethe University Frankfurt. He has been a professor at both Goethe University and the University of Mannheim.
His academic work has focused on improving long-term portfolio strategies by building better expectations of risks, returns, and their dynamics. He is known for practical innovations in finance, such as developing forward-looking betas marketed by IvyDB OptionMetrics, establishing implied skewness and generalized lower bounds as cross-sectional stock characteristics, and creating measures of climate change exposure from earnings calls. His current research encompasses factor dispersions, factor and sector rotation, asset allocation with implied data, and machine learning in options analysis.
Heavy-Tailed Large Deviations and Sharp Characterization of Global Dynamics of SGDs in Deep Learning
Abstract
While the typical behaviors of stochastic systems are often deceptively oblivious to the tail distributions of the underlying uncertainties, the ways rare events arise differ vastly depending on whether those tails are light or heavy. Roughly speaking, in light-tailed settings a system-wide rare event arises because everything goes wrong a little bit, as if the entire system has conspired to provoke the rare event (the conspiracy principle), whereas in heavy-tailed settings a system-wide rare event arises because a small number of components fail catastrophically (the catastrophe principle). In the first part of this talk, I will introduce recent developments in the theory of large deviations for heavy-tailed stochastic processes at the sample-path level and rigorously characterize the catastrophe principle for such processes.
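A quick Monte Carlo sketch makes the two principles tangible (the distributions, sample sizes, and threshold below are illustrative choices, not taken from the talk): conditioned on an unusually large sum, light-tailed components each contribute a little, while heavy-tailed samples are dominated by a single huge term.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 20, 200_000

def max_share_given_large_sum(sampler, quantile=0.999):
    """Among samples whose sum exceeds a high quantile, how much of the
    sum comes from the single largest component?"""
    x = sampler((trials, n))
    sums = x.sum(axis=1)
    rare = x[sums > np.quantile(sums, quantile)]
    return (rare.max(axis=1) / rare.sum(axis=1)).mean()

light = lambda size: rng.exponential(1.0, size)       # light-tailed
heavy = lambda size: rng.pareto(1.5, size) + 1.0      # heavy-tailed Pareto, alpha = 1.5

print("light-tailed max share:", max_share_given_large_sum(light))  # spread out: conspiracy
print("heavy-tailed max share:", max_share_given_large_sum(heavy))  # one dominant term: catastrophe
```

In the light-tailed case the largest component accounts for only a modest fraction of the conditioned sum, while in the heavy-tailed case it accounts for most of it, which is exactly the contrast between the two principles.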
The empirical success of deep learning is often attributed to the mysterious ability of stochastic gradient descents (SGDs) to avoid sharp local minima in the loss landscape, as sharp minima are believed to lead to poor generalization. To unravel this mystery and potentially further enhance such capability of SGDs, it is imperative to go beyond the traditional local convergence analysis and obtain a comprehensive understanding of SGDs' global dynamics within complex non-convex loss landscapes. In the second part of this talk, I will characterize the global dynamics of SGDs, building on the heavy-tailed large deviations and local stability framework developed in the first part. This leads to heavy-tailed counterparts of the classical Freidlin-Wentzell and Eyring-Kramers theories. Moreover, we reveal a fascinating phenomenon in deep learning: by injecting and then truncating heavy-tailed noise during the training phase, SGD can almost completely avoid sharp minima and hence achieve better generalization performance on test data.
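The escape mechanism can be caricatured in a few lines of Python (the one-dimensional landscape, noise law, and truncation level below are invented for illustration and are not the authors' model): with Gaussian noise, SGD stays trapped in a sharp minimum, whereas injected heavy-tailed noise lets it jump out of the narrow well in a single large step, and the truncation keeps any single jump from clearing the wide one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Schematic 1-D landscape: a sharp well at x = -1 (curvature 50) and a
# wide well at x = 3 (curvature 5), with the barrier between them at x = 0.5.
def grad(x):
    return 50.0 * (x + 1.0) if x < 0.5 else 5.0 * (x - 3.0)

def run_sgd(noise, steps=5000, lr=0.01, clip=2.0):
    x = -1.0                                    # start in the sharp minimum
    for _ in range(steps):
        step = -lr * (grad(x) + noise())
        x += float(np.clip(step, -clip, clip))  # truncate every update
    return x

gaussian = lambda: rng.normal(0.0, 5.0)                                  # light-tailed
heavy = lambda: rng.choice([-1.0, 1.0]) * (rng.pareto(1.2) + 1.0) * 5.0  # Pareto, alpha = 1.2

for name, noise in [("Gaussian noise", gaussian), ("truncated heavy-tailed noise", heavy)]:
    ends = [run_sgd(noise) for _ in range(200)]
    print(f"{name}: fraction of runs ending in the wide minimum = "
          f"{np.mean([e > 0.5 for e in ends]):.2f}")
```

The truncation level is chosen so that one clipped jump can cover the 1.5-unit exit from the sharp well but not the 2.5-unit exit from the wide well, so the heavy-tailed runs accumulate in the wide, flat minimum while the Gaussian runs almost never leave the sharp one.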
This talk is based on joint work with Mihail Bazhba, Jose Blanchet, Bohan Chen, Sewoong Oh, Zhe Su, Xingyu Wang, and Bert Zwart.