Anomalous dimensions of twist-2 operators and Pomeron in N=4 SUSY
Abstract
Apologies - this seminar is CANCELLED
Hypergraph transversals have been studied in mathematics for a long time (e.g. by Berge). Generating the minimal transversals of a hypergraph is an important problem with many applications in computer science, especially in database theory, logic, and AI. We give a survey of various applications and review some recent results on the complexity of computing all minimal transversals of a given hypergraph.
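As a concrete (if naive) illustration of the problem, the following Python sketch enumerates all minimal transversals of a tiny hypergraph by brute force. The function name is my own, and the exponential search is of course far from the output-efficient algorithms the talk concerns.

```python
from itertools import chain, combinations

def minimal_transversals(vertices, edges):
    """Brute-force enumeration of all minimal transversals (hitting sets):
    a transversal meets every hyperedge; it is minimal if no proper
    subset is also a transversal.  Exponential in |vertices|, so this
    is only suitable for tiny instances."""
    subsets = chain.from_iterable(
        combinations(vertices, r) for r in range(len(vertices) + 1))
    hitting = [set(s) for s in subsets if all(set(s) & e for e in edges)]
    return [s for s in hitting if not any(t < s for t in hitting)]

edges = [{1, 2}, {2, 3}, {3, 1}]   # the triangle, viewed as a hypergraph
print(sorted(map(sorted, minimal_transversals([1, 2, 3], edges))))
# -> [[1, 2], [1, 3], [2, 3]]: every pair of vertices, as expected
```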
The Riemann-Siegel functions $\theta(t)$ and $Z(t)$ are defined in terms of $\zeta(\frac{1}{2} +it)$. Since $Z$ is a real function, a zero of $\zeta(s)$ on the critical line corresponds to a sign change in $Z(t)$. Points where $\theta(t) = n\pi$ are called Gram points, and the so-called Gram's law states that between consecutive Gram points there is a zero of $Z(t)$, and hence of $\zeta(\frac{1}{2} +it)$. This is known to be false in general, and work will be presented attempting to quantify how frequently it fails.
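For readers who want to experiment, the mpmath library exposes the Riemann-Siegel functions directly. The sketch below checks a common form of Gram's law, $(-1)^n Z(g_n) > 0$ at the $n$-th Gram point $g_n$ (which forces a sign change of $Z$ in each Gram interval), at small $n$ and at its first known failure, $n = 126$.

```python
from mpmath import siegelz, grampoint

def grams_law_holds(n):
    """Gram's law in the form (-1)^n Z(g_n) > 0, where g_n = grampoint(n)
    is the n-th Gram point, i.e. theta(g_n) = n*pi."""
    return (-1) ** n * siegelz(grampoint(n)) > 0

print(all(grams_law_holds(n) for n in range(20)))   # holds for small n
print(grams_law_holds(126))                         # first known failure
```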
We study randomized (i.e. Monte Carlo) algorithms to compute expectations of Lipschitz functionals with respect to measures on infinite-dimensional spaces, e.g., Gaussian measures or distributions of diffusion processes. We determine the order of minimal errors and corresponding almost optimal algorithms for three different sampling regimes: fixed-subspace sampling, variable-subspace sampling, and full-space sampling. It turns out that these minimal errors are closely related to quantization numbers and Kolmogorov widths for the underlying measure. For variable-subspace sampling, suitable multi-level Monte Carlo methods, recently introduced by Giles, turn out to be almost optimal.
Joint work with Jakob Creutzig (Darmstadt), Steffen Dereich (Bath), Thomas Müller-Gronbach (Magdeburg)
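A minimal sketch of the multi-level Monte Carlo idea mentioned above, under simplifying assumptions of my own (geometric Brownian motion, Euler discretization, fixed rather than optimized sample sizes); the function names are illustrative and not from the work above.

```python
import math
import random

def gbm_pair(level, rng, x0=1.0, mu=0.05, sigma=0.2, T=1.0):
    """Coupled coarse/fine Euler paths for dX = mu X dt + sigma X dW:
    the fine path uses 2^level steps; the coarse path uses half as many,
    driven by the same Brownian increments summed in pairs."""
    n = 2 ** level
    h = T / n
    xf, xc, dw_pair = x0, x0, 0.0
    for i in range(n):
        dw = rng.gauss(0.0, math.sqrt(h))
        xf += mu * xf * h + sigma * xf * dw
        if level > 0:
            dw_pair += dw
            if i % 2 == 1:            # one coarse step per two fine steps
                xc += mu * xc * (2 * h) + sigma * xc * dw_pair
                dw_pair = 0.0
    return xf, xc

def mlmc_estimate(L, N0, f, pair, rng):
    """Giles' telescoping estimator
    E[f(X_L)] ~ E[f(X_0)] + sum_{l=1}^{L} E[f(X_l) - f(X_{l-1})],
    with sample sizes halving on the finer (more expensive) levels."""
    est = 0.0
    for level in range(L + 1):
        N = max(N0 // 2 ** level, 1)
        s = 0.0
        for _ in range(N):
            fine, coarse = pair(level, rng)
            s += f(fine) - (f(coarse) if level > 0 else 0.0)
        est += s / N
    return est

rng = random.Random(42)
est = mlmc_estimate(L=4, N0=20000, f=lambda x: x, pair=gbm_pair, rng=rng)
# exact value: E[X_T] = x0 * exp(mu * T) = exp(0.05) ~ 1.0513
```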
In ordinary percolation, sites of a lattice are open with a given probability and one investigates the existence of infinite clusters (percolation). In dynamical percolation, the sites randomly flip between the states open and closed and one investigates the existence of "atypical" times at which the percolation structure is different from that of a fixed time.
1. I will quickly present some of the original results for dynamical percolation (joint work with Olle Haggstrom and Yuval Peres), including the absence of exceptional times for critical percolation in high dimensions.
2. I will go into some details concerning a recent result that, for the 2-dimensional triangular lattice, there are exceptional times for critical percolation (joint work with Oded Schramm). This involves an interesting connection with the harmonic analysis of Boolean functions and randomized algorithms, and relies on the recent computation of critical exponents by Lawler, Schramm, Smirnov, and Werner.
3. If there is time, I will mention some very recent results of Garban, Pete, and Schramm on the Fourier spectrum of critical percolation.
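The model described above can be caricatured in a few lines of Python: a discrete-time toy version in which one uniform site is refreshed per step, recording at each time whether a top-to-bottom open crossing exists. This is only a sketch under my own simplifications (the actual model refreshes sites in continuous time), and all names are illustrative.

```python
import random
from collections import deque

def crosses(open_, n):
    """True if an open top-to-bottom path exists in the n x n grid (BFS)."""
    seen = {(0, j) for j in range(n) if open_[0][j]}
    queue = deque(seen)
    while queue:
        i, j = queue.popleft()
        if i == n - 1:
            return True
        for a, b in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= a < n and 0 <= b < n and open_[a][b] and (a, b) not in seen:
                seen.add((a, b))
                queue.append((a, b))
    return False

def dynamical_percolation(n=10, p=0.6, flips=2000, seed=1):
    """Discrete-time caricature of dynamical percolation: each step refreshes
    one uniform site to open with probability p, then records whether the
    configuration percolates (top-bottom crossing) at that time."""
    rng = random.Random(seed)
    open_ = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    history = []
    for _ in range(flips):
        i, j = rng.randrange(n), rng.randrange(n)
        open_[i][j] = rng.random() < p
        history.append(crosses(open_, n))
    return history
```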
The Snaer program calculates the posterior mean and variance of a set of variables: for some we have data (with precisions), for some we have prior information (with precisions), and for some, prior indicator ratios (with precisions) are available. The variables must satisfy a number of exact restrictions, and the system is both large and sparse. Two key aspects of the statistical and computational development are a practical procedure for solving a linear integer system and a stable linearization routine for ratios. We test our numerical method for solving large sparse linear least-squares estimation problems and find that it performs well, even when the $n \times k$ design matrix is large ($nk = O(10^{8})$).
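The abstract does not specify the solver, but a standard approach to large sparse least-squares problems of this kind is conjugate gradients on the normal equations (CGLS), which touches the design matrix only through sparse matrix-vector products. A self-contained sketch, with illustrative names of my own:

```python
def mul(triplets, x, m):
    """y = A x, with A stored as (row, col, value) triplets (COO format)."""
    y = [0.0] * m
    for i, j, v in triplets:
        y[i] += v * x[j]
    return y

def mul_t(triplets, y, n):
    """x = A^T y for the same COO representation."""
    x = [0.0] * n
    for i, j, v in triplets:
        x[j] += v * y[i]
    return x

def cgls(triplets, b, m, n, iters=100):
    """CGLS: conjugate gradients on the normal equations A^T A x = A^T b,
    i.e. min ||Ax - b||_2, using only sparse matvecs with A and A^T."""
    x = [0.0] * n
    r = b[:]                       # residual b - A x
    s = mul_t(triplets, r, n)      # gradient A^T r
    p = s[:]
    gamma = sum(v * v for v in s)
    for _ in range(iters):
        q = mul(triplets, p, m)
        alpha = gamma / sum(v * v for v in q)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, q)]
        s = mul_t(triplets, r, n)
        gamma_new = sum(v * v for v in s)
        if gamma_new < 1e-24:      # gradient essentially zero: converged
            break
        p = [si + (gamma_new / gamma) * pi for si, pi in zip(s, p)]
        gamma = gamma_new
    return x

A = [(0, 0, 1.0), (1, 1, 1.0), (2, 0, 1.0), (2, 1, 1.0)]   # 3 x 2 system
x = cgls(A, [1.0, 2.0, 3.1], m=3, n=2)
print(x)   # least-squares solution, ~ [1.0333, 2.0333]
```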
There are many triangulated categories that arise in the study
of group cohomology: the derived, stable or homotopy categories, for
example. In this talk I shall describe the relative cohomological
versions and the relationship with ordinary cohomology. I will explain
what we know (and what we would like to know) about these categories, and
how the representation type of certain subgroups makes a fundamental
difference.
Given an algebraic variety $X$ over the finite field ${\bf F}_{q}$, it is known that the zeta function of $X$,
$$ Z(X,T) := \exp\left( \sum_{k=1}^{\infty} \frac{\#X({\bf F}_{q^{k}})\,T^{k}}{k} \right) $$
is a rational function of $T$. It is an ongoing topic of research to efficiently compute $Z(X,T)$ given the defining equation of $X$.
I will summarize how we can use Berthelot's rigid cohomology (sparing you the actual construction) to compute $Z(X,T)$, first done for hyperelliptic curves by Kedlaya. I will go on to describe Lauder's deformation algorithm, and the promising fibration algorithm, outlining the present drawbacks.
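The rationality of $Z(X,T)$ can be checked by hand in the simplest case: for the projective line, $\#{\bf P}^1({\bf F}_{q^k}) = q^k + 1$ and $Z({\bf P}^1,T) = 1/((1-T)(1-qT))$. The sketch below (exact rational arithmetic; names my own) truncates the exponential and compares power-series coefficients.

```python
from fractions import Fraction

def zeta_series(counts, prec):
    """Coefficients of Z(X,T) = exp(sum_{k>=1} N_k T^k / k) up to O(T^prec),
    given N_k = #X(F_{q^k}) for k = 1..prec-1.  Uses the recurrence
    n z_n = sum_{k=1}^{n} N_k z_{n-k}, obtained from Z' = (log Z)' Z."""
    z = [Fraction(0)] * prec
    z[0] = Fraction(1)
    for n in range(1, prec):
        z[n] = sum(Fraction(counts[k - 1]) * z[n - k]
                   for k in range(1, n + 1)) / n
    return z

q, prec = 5, 8
z = zeta_series([q ** k + 1 for k in range(1, prec)], prec)   # P^1 over F_5

# 1/((1-T)(1-qT)) has T^n coefficient 1 + q + ... + q^n:
print(z == [sum(q ** i for i in range(n + 1)) for n in range(prec)])  # True
```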
Glauber dynamics on $\mathbb{Z}^d$ is a dynamic representation of the zero-temperature Ising model, in which the spin (either $+$ or $-$) of each vertex updates, at random times, to the state of the majority of its neighbours. It has long been conjectured that the critical probability $p_c(\mathbb{Z}^d)$ for fixation (every vertex eventually in the same state) is $1/2$, but it was only recently proved (by Fontes, Schonmann and Sidoravicius) that $p_c(\mathbb{Z}^d) < 1$.
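A toy discrete-time version of these dynamics is easy to simulate. The sketch below (names and the tie-breaking convention are my own simplifications) updates one uniform site per step to the majority of its four neighbours on an $n \times n$ torus.

```python
import random

def glauber_step(spins, n, rng):
    """One zero-temperature update on the n x n torus: pick a uniform site
    and set it to the majority spin of its 4 neighbours, breaking ties
    with a fair coin (a discrete-time caricature of the random update
    times in the abstract)."""
    i, j = rng.randrange(n), rng.randrange(n)
    total = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j]
             + spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
    if total > 0:
        spins[i][j] = 1
    elif total < 0:
        spins[i][j] = -1
    else:
        spins[i][j] = rng.choice([-1, 1])

def run(n=20, p=0.5, steps=20000, seed=0):
    """Start from i.i.d. spins (+ with probability p) and run the dynamics."""
    rng = random.Random(seed)
    spins = [[1 if rng.random() < p else -1 for _ in range(n)]
             for _ in range(n)]
    for _ in range(steps):
        glauber_step(spins, n, rng)
    return spins
```

Note that an all-$+$ (or all-$-$) configuration is absorbing: every majority vote then reproduces the current spin, which is the finite-volume analogue of fixation.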