Past Stochastic Analysis Seminar

Today
15:45
Abstract

In this talk, we discuss how to combine recurrent neural networks with the signature feature set to tackle supervised learning problems in which the input is a data stream. We apply this method to several datasets, both synthetic (learning the solution to SDEs) and empirical (action recognition), and demonstrate its effectiveness.
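The abstract does not specify an implementation, but the signature features it refers to are the iterated integrals of the path. As an illustration only, here is a minimal sketch computing the signature of a piecewise-linear data stream truncated at level 2, combining segments with Chen's identity; dedicated libraries such as iisignature compute this to higher depth.

```python
import numpy as np

def signature_level2(path):
    """Truncated (level-2) signature of a piecewise-linear path.

    path: array of shape (N, d) of sample points.
    Returns (S1, S2): the level-1 increments and the matrix of
    level-2 iterated integrals, built segment by segment.
    """
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1 = np.zeros(d)
    S2 = np.zeros((d, d))
    for delta in np.diff(path, axis=0):
        # Chen's identity for appending one linear segment: the level-2
        # term gains the cross term S1 (x) delta plus the segment's own
        # contribution delta (x) delta / 2.
        S2 += np.outer(S1, delta) + np.outer(delta, delta) / 2.0
        S1 += delta
    return S1, S2
```

The flattened entries of `S1` and `S2` can then be fed, window by window, into a recurrent network as features; the antisymmetric part of `S2` is the Lévy area of the stream.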

 

  • Stochastic Analysis Seminar
Today
14:15
Abstract

In this talk, we will revisit the proof of the large deviations principle for Wiener chaoses, given partially by Borell and then by Ledoux in its full form. We show that some heavy-tail phenomena observed in large deviations can be explained by the same mechanism as for Wiener chaoses, meaning that the deviations are created, in a sense, by translations. More precisely, we prove a general large deviations principle for a certain class of functionals $f_n : \mathbb{R}^n \to \mathcal{X}$, where $\mathcal{X}$ is some metric space, under the probability measure $\nu_{\alpha}^n$, where $\nu_{\alpha} = Z_{\alpha}^{-1}e^{-|x|^{\alpha}}dx$, $\alpha \in (0,2]$, for which the large deviations are due to translations. As an application, we retrieve the known large deviations principles for the empirical spectral measure, the largest eigenvalue, and traces of polynomials of so-called Wigner matrices without Gaussian tails. We also apply our large deviations result to the last-passage time, which yields a large deviations principle when the weight matrix has law $\mu_{\alpha}^{n^2}$, where $\mu_{\alpha}$ is the probability measure on $\mathbb{R}^+$ with density $2Z_{\alpha}^{-1}e^{-x^{\alpha}}$ for $\alpha \in (0,1)$.

 

  • Stochastic Analysis Seminar
16 October 2017
15:45
IMANOL PEREZ
Abstract

The signature of a path has many properties that make it an excellent feature set for machine learning. We exploit these properties to analyse a stream of data arising from a psychiatric study of bipolar and borderline personality disorders. We build a machine learning model based on signatures that tries to answer two clinically relevant questions, based on observations of participants' reported state over a short period of time: is it possible to predict whether a person is healthy, has bipolar disorder, or has borderline personality disorder? And given a person with bipolar or borderline personality disorder, is it possible to predict his or her future mood? Signatures proved very effective at tackling both problems.

  • Stochastic Analysis Seminar
16 October 2017
14:15
Abstract

 

As a first step toward the uniqueness and blowup properties of solutions of stochastic wave equations with multiplicative noise, we analyze conditions for the uniqueness and blowup properties of the solution $(X_t, Y_t)$ of the equations $dX_t = Y_t\,dt$, $dY_t = |X_t|^{\alpha}\,dB_t$, $(X_0, Y_0) = (x_0, y_0)$. In particular, we prove that solutions are nonunique if $0 < \alpha < 1$ and $(x_0, y_0) = (0, 0)$, and unique if $1/2 < \alpha$ and $(x_0, y_0) \neq (0, 0)$. We also show that blowup in finite time holds if $\alpha > 1$ and $(x_0, y_0) \neq (0, 0)$.

This is a joint work with A. Gomez, J.J. Lee, C. Mueller and M. Salins.
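The system in the abstract is easy to experiment with numerically. Purely as an illustration (not part of the talk), here is a plain Euler-Maruyama discretization of $dX_t = Y_t\,dt$, $dY_t = |X_t|^{\alpha}\,dB_t$; note that started from $(0,0)$ the scheme reproduces the trivial solution $X \equiv 0$, which is one of the many solutions in the nonuniqueness regime $0 < \alpha < 1$.

```python
import numpy as np

def simulate(alpha, x0, y0, T=1.0, n_steps=10_000, seed=0):
    """Euler-Maruyama scheme for
        dX_t = Y_t dt,   dY_t = |X_t|^alpha dB_t,
    started from (x0, y0). Returns the sampled paths (X, Y)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    X = np.empty(n_steps + 1)
    Y = np.empty(n_steps + 1)
    X[0], Y[0] = x0, y0
    dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)  # Brownian increments
    for k in range(n_steps):
        X[k + 1] = X[k] + Y[k] * dt
        Y[k + 1] = Y[k] + abs(X[k]) ** alpha * dB[k]
    return X, Y
```

Since the diffusion coefficient vanishes at $X = 0$, the discretization started at the origin never leaves it, whereas other solutions exist there when $0 < \alpha < 1$.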

 

  • Stochastic Analysis Seminar
12 June 2017
15:45
NICOLAS PERKOWSKI
Abstract

We consider a class of nonlinear population models on a two-dimensional lattice which are influenced by a small random potential, and we show that on large temporal and spatial scales the population density is well described by the continuous parabolic Anderson model, a linear but singular stochastic PDE. The proof is based on a discrete formulation of paracontrolled distributions on unbounded lattices which is of independent interest because it can be applied to prove the convergence of a wide range of lattice models. This is joint work with Jörg Martin.
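As a toy illustration of the lattice model underlying the continuum limit (ignoring the renormalization needed for the singular continuum equation, and using an i.i.d. potential purely as an assumption for the sketch), one can evolve a discrete parabolic Anderson model $\partial_t u = \Delta u + \xi u$ on a periodic two-dimensional lattice:

```python
import numpy as np

def pam_step(u, xi, dt):
    """One explicit-Euler step of du/dt = (Laplacian)u + xi*u
    on a periodic 2-d lattice; u and xi are (n, n) arrays."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
           + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    return u + dt * (lap + xi * u)

def evolve(n=32, n_steps=200, dt=0.01, eps=0.1, seed=0):
    """Evolve a flat initial density in a small frozen random
    potential eps * xi with i.i.d. standard Gaussian entries."""
    rng = np.random.default_rng(seed)
    xi = eps * rng.normal(size=(n, n))
    u = np.ones((n, n))
    for _ in range(n_steps):
        u = pam_step(u, xi, dt)
    return u
```

With `eps = 0` the discrete Laplacian alone leaves a flat profile unchanged; switching on the small potential lets mass concentrate where `xi` is large, the intermittency effect the continuum model captures.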

  • Stochastic Analysis Seminar
5 June 2017
15:45
ANDREAS EBERLE
Abstract


The (kinetic) Langevin equation is an SDE with degenerate noise that describes the motion of a particle in a force field, subject to damping and random collisions. It is also closely related to Hamiltonian Monte Carlo methods. An important open question is why, in certain cases, kinetic Langevin diffusions seem to approach equilibrium faster than overdamped Langevin diffusions. So far, convergence to equilibrium for kinetic Langevin diffusions has been studied almost exclusively by analytic techniques. In this talk, I present a new probabilistic approach based on a specific combination of reflection and synchronous coupling of two solutions of the Langevin equation. The approach yields rather precise bounds on convergence to equilibrium at the borderline between the overdamped and underdamped regimes, and it may help shed some light on the open question mentioned above.
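For concreteness, the kinetic Langevin dynamics the talk analyses can be written as $dX_t = V_t\,dt$, $dV_t = -\gamma V_t\,dt - \nabla U(X_t)\,dt + \sqrt{2\gamma}\,dB_t$, with invariant measure proportional to $e^{-U(x) - v^2/2}$. The following is a plain simulation sketch (the coupling argument itself is not reproduced here; the discretization and parameters are illustrative assumptions):

```python
import numpy as np

def kinetic_langevin(grad_U, x0, v0, gamma=1.0, dt=1e-2,
                     n_steps=50_000, seed=0):
    """Euler-Maruyama scheme for the kinetic Langevin equation
        dX_t = V_t dt,
        dV_t = -gamma V_t dt - grad_U(X_t) dt + sqrt(2 gamma) dB_t.
    Returns the sampled positions X."""
    rng = np.random.default_rng(seed)
    xs = np.empty(n_steps)
    x, v = float(x0), float(v0)
    for k in range(n_steps):
        x += v * dt
        v += (-gamma * v - grad_U(x)) * dt \
             + np.sqrt(2.0 * gamma * dt) * rng.normal()
        xs[k] = x
    return xs
```

For the quadratic potential $U(x) = x^2/2$ (so `grad_U = lambda x: x`), the positions should equilibrate to a standard Gaussian; varying `gamma` moves the dynamics between the underdamped and overdamped regimes discussed in the talk.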

  • Stochastic Analysis Seminar
5 June 2017
14:15
DAVID ELWORTHY
Abstract

There is a routine for obtaining formulae for derivatives of smooth heat semigroups, and for certain heat semigroups acting on differential forms etc., established some time ago by myself, Le Jan, and Xue-Mei Li. Following a description of this in its general form, I will discuss its applicability in some sub-Riemannian situations and to higher-order derivatives.

 
