Tue, 25 Jan 2022

14:00 - 15:00
Virtual

The emergence of concepts in shallow neural networks

Elena Agliari
(University of Rome Sapienza)
Abstract

In the first part of the seminar, I will introduce shallow neural networks from a statistical-mechanics perspective, focusing on simple cases and on a naive scenario where the information to be learnt is structureless. Then, inspired by biological information processing, I will enrich this framework by accounting for structured datasets and by making the network able to perform challenging tasks like generalization or even "taking a nap". The results presented are both analytical and numerical.
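
As a concrete point of reference (not taken from the talk itself), the Hopfield model is the textbook statistical-mechanics example of a shallow associative network. The Python sketch below, offered purely as an illustration, stores a few random binary patterns with a Hebbian rule and retrieves one of them from a corrupted cue.

    # Minimal Hopfield-style associative memory (illustrative sketch only,
    # not code from the talk): Hebbian storage and asynchronous retrieval.
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 5                            # neurons, stored patterns
    xi = rng.choice([-1, 1], size=(P, N))    # random binary patterns

    # Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-couplings
    J = (xi.T @ xi) / N
    np.fill_diagonal(J, 0.0)

    # Start from a corrupted version of pattern 0 and relax by asynchronous updates
    s = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
    for _ in range(10):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1

    print("overlap with stored pattern:", (s @ xi[0]) / N)   # close to 1 on retrieval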

Tue, 18 Jan 2022

14:00 - 15:00
Virtual

FFTA: AI-Bind: Improving Binding Predictions for Novel Protein Targets and Ligands

Giulia Menichetti
(Northeastern University)
Abstract

Identifying novel drug-target interactions (DTIs) is a critical and rate-limiting step in drug discovery. While deep learning models have been proposed to accelerate the identification process, we show that state-of-the-art models fail to generalize to novel (i.e., never-before-seen) structures. We first unveil the mechanisms responsible for this shortcoming, demonstrating how models rely on shortcuts that leverage the topology of the protein-ligand bipartite network rather than learning the node features. Then, we introduce AI-Bind, a pipeline that combines network-based sampling strategies with unsupervised pre-training, allowing us to limit the annotation imbalance and improve binding predictions for novel proteins and ligands. We illustrate the value of AI-Bind by predicting drugs and natural compounds with binding affinity to SARS-CoV-2 viral proteins and the associated human proteins. We also validate these predictions via auto-docking simulations and comparison with recent experimental evidence. Overall, AI-Bind offers a high-throughput approach to identifying drug-target combinations, with the potential to become a powerful tool in drug discovery.

arXiv link: https://arxiv.org/abs/2112.13168
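
As a toy illustration of the "topology shortcut" described in the abstract (my own hypothetical example, not AI-Bind code), the sketch below scores protein-ligand pairs purely from node degrees in the training bipartite network. Such a score can look accurate for well-annotated nodes yet carries no signal at all for never-before-seen proteins or ligands, which is the failure mode AI-Bind is designed to avoid.

    # Hypothetical degree-based baseline: predicts "binding" from annotation
    # counts alone, ignoring node features entirely.
    from collections import Counter

    train_pairs = [("P1", "L1"), ("P1", "L2"), ("P2", "L1"), ("P3", "L3")]
    prot_deg = Counter(p for p, _ in train_pairs)
    lig_deg = Counter(l for _, l in train_pairs)

    def degree_score(protein, ligand):
        # Higher score for heavily annotated nodes; zero for novel ones.
        return prot_deg.get(protein, 0) * lig_deg.get(ligand, 0)

    print(degree_score("P1", "L1"))   # seen-seen pair: a nonzero "prediction"
    print(degree_score("P9", "L9"))   # novel-novel pair: 0, no usable signal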

Fri, 25 Feb 2022

16:00 - 17:00
L1

North Meets South

Pascal Heid and Ilyas Khan
Abstract

This event will be hybrid and will take place in L1 and on Teams. A link will be available 30 minutes before the session begins.

Pascal Heid
Title: Adaptive iterative linearised Galerkin methods for nonlinear PDEs

Abstract: A wide variety of iterative methods for the solution of nonlinear equations exist. In many cases, such schemes can be interpreted as iterative local linearisation methods, which can be obtained by applying a suitable linear preconditioning operator to the original nonlinear equation. Based on this observation, we will derive an abstract linearisation framework which recovers some prominent iteration schemes. Subsequently, in order to cast this unified iteration procedure into a computational scheme, we will consider the discretisation by means of finite dimensional subspaces. We may then obtain an effective numerical algorithm by an instantaneous interplay of the iterative linearisation and an (optimally convergent) adaptive discretisation method. This will be demonstrated by a numerical experiment for a quasilinear elliptic PDE in divergence form.   
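
As a minimal sketch of what "iterative local linearisation" can mean here (my own illustrative example, not the speaker's code), the damped fixed-point iteration u_{n+1} = u_n + delta (f - F(u_n)) of Zarantonello type applies the simplest possible linear preconditioner, the identity scaled by a damping parameter, to the nonlinear residual.

    # Illustrative sketch: a Zarantonello-type iteration for the scalar
    # nonlinear equation F(u) = f, with F strongly monotone.
    def F(u):
        return u + u**3          # simple monotone nonlinearity

    f = 2.0                      # exact solution of u + u^3 = 2 is u = 1
    u, delta = 0.0, 0.2          # initial guess and damping parameter
    for _ in range(100):
        u = u + delta * (f - F(u))

    print(u, F(u))               # u close to 1.0, F(u) close to 2.0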

 

Ilyas Khan
Title: Geometric Analysis: Curvature and Applications

Abstract: Often, one will want to find a geometric structure on some given manifold satisfying certain properties. For example, one might want to find a minimal embedding of one manifold into another, or a metric on a manifold with constant scalar curvature, to name some well known examples of this sort of problem. In general, these problems can be seen as equivalent to solving a system of PDEs: differential relations on coordinate patches that can be assembled compatibly over the whole manifold to give a globally defined geometric equation.

In this talk, we will present the theories of minimal surfaces and mean curvature flow as representative examples of the techniques and philosophy that geometric analysis employs to solve problems in geometry of the aforementioned type. The description of the theory will be accompanied by a number of examples and applications to other fields, including physics, topology, and dynamics. 
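
A standard worked example of the second theory mentioned above (a textbook fact, not a result from the talk): under mean curvature flow $\partial_t x = -H\nu$, a round sphere of radius $r$ in $\mathbb{R}^n$ has mean curvature $H = (n-1)/r$, so its radius satisfies

    \frac{dr}{dt} = -\frac{n-1}{r}, \qquad r(t) = \sqrt{r_0^2 - 2(n-1)\,t},

and the sphere shrinks to a point at the finite time $T = r_0^2/(2(n-1))$, the simplest instance of the finite-time singularities the theory has to analyse.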

Fri, 18 Feb 2022

16:00 - 17:00
L1

Conferences and collaboration

Abstract

This event will be hybrid and will take place in L1 and on Teams. A link will be available 30 minutes before the session begins.

'Conferences and collaboration' is a Fridays@4 group discussion. The goal is to have an open and honest conversation about the hurdles posed by conferences and collaborations, led by a panel of graduate students and postdocs. Conferences can be both exciting and stressful - they involve meeting new people and learning new mathematics, but can also be intimidating new professional experiences. Many of us will either never have been to one in person, or at least not have been to one in the past two years. Optimistically looking towards the world opening up again, we thought it would be a good time to ask questions such as:
- Which talks should I go to?
- How do I cope with incomprehensible talks? Is it imposter syndrome, or is the speaker just bad?
- Should I introduce myself to more senior people in the field, and if so, how should I go about it?
- How do you start collaborations? Does it happen at conferences or elsewhere?
- How do you approach workload in collaborations?
- What happens if a collaboration isn't working out?
- FOMO if you like working by yourself.

Over the hour we'll have a conversation about these hurdles and, most importantly, talk about how we can make conferences and collaborations better for everyone early in their careers.

Fri, 04 Feb 2022

16:00 - 17:00
L1

Careers outside of academia

Kim Moore (Faculty AI) and Sébastien Racanière (Google DeepMind)
Abstract

This event will take place on Teams. A link will be available 30 minutes before the session begins.

Sébastien Racanière is a Staff Research Engineer at DeepMind. His current main interest is the use of symmetries in Machine Learning, which offers diverse applications, for example in Neuroscience or Theoretical Physics (in particular Lattice Quantum Chromodynamics). Past interests, still in Machine Learning, include Reinforcement Learning (i.e. learning from rewards), generative models (i.e. learning to sample from probability distributions), and optimisation (i.e. how to find 'good' minima of functions).

 

Kim Moore is a senior data scientist at Faculty, a data science consultancy based in London. As a data scientist, her role is to help clients across sectors such as healthcare, government and consumer business solve their problems using data science and AI. This involves applying a variety of techniques, ranging from simple data analysis to designing and implementing bespoke machine learning algorithms. Kim will talk about day-to-day life at Faculty, some interesting projects that she has worked on, and why her mathematical background makes her a great data scientist.

Fri, 28 Jan 2022

16:00 - 17:00
L1

North Meets South

Kaibo Hu and Davide Spriano
Abstract

This event will be hybrid and will take place in L1 and on Teams. A link will be available 30 minutes before the session begins.

Kaibo Hu
Title: Complexes from complexes
Abstract:
Continuous and discrete (finite element) de Rham complexes have inspired key progress in the mathematical and numerical analysis of the Maxwell equations. In this talk, we derive new differential complexes from the de Rham complexes. These complexes have applications in, e.g., general relativity and continuum mechanics. Examples include the elasticity (Kröner or Calabi) complex, which encodes fundamental structures in Riemannian geometry and elasticity. This homological algebraic construction is inspired by the Bernstein-Gelfand-Gelfand (BGG) machinery from representation theory. Analytic results, e.g., various generalisations of the Korn inequality, follow from the algebraic structures. We briefly discuss applications in numerical PDEs and other fields.
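
For orientation (a standard fact rather than a result of the talk), the three-dimensional de Rham complex referred to above is the sequence

    0 \to H^1 \xrightarrow{\operatorname{grad}} H(\operatorname{curl}) \xrightarrow{\operatorname{curl}} H(\operatorname{div}) \xrightarrow{\operatorname{div}} L^2 \to 0,

in which the image of each operator lies in the kernel of the next; the BGG construction mentioned in the abstract produces further complexes, such as the elasticity complex, by coupling copies of sequences of this kind.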

Davide Spriano

Title: Growth of groups.

Abstract:
Given a transitive graph, it is natural to consider how many vertices are contained in a ball of radius n, and to study how this quantity changes as n increases. We call such a function the growth of the graph.

In this talk, we will see some examples of growth of Cayley graphs of groups and survey some classical results. Then we will see a dichotomy in the growth behaviour of groups acting on CAT(0) cube complexes.
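
As a concrete illustration of the gap between polynomial and exponential growth (my own example, not taken from the talk), the sketch below compares ball sizes in the Cayley graph of Z^2 with the standard generators against those of the free group on two generators.

    # Illustrative sketch: ball sizes |B(n)| in two Cayley graphs.
    # Z^2 grows polynomially, while the free group F_2 grows exponentially.
    def ball_size_Z2(n):
        # lattice points with |x| + |y| <= n (word metric for generators (±1,0), (0,±1))
        return sum(1 for x in range(-n, n + 1)
                     for y in range(-n, n + 1) if abs(x) + abs(y) <= n)

    def ball_size_F2(n):
        # reduced words of length at most n over two generators and their inverses
        return 1 + sum(4 * 3 ** (k - 1) for k in range(1, n + 1))

    for n in range(1, 6):
        print(n, ball_size_Z2(n), ball_size_F2(n))   # 2n^2 + 2n + 1 vs 2*3^n - 1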

Fri, 21 Jan 2022

16:00 - 17:00
L1

Thriving in, or perhaps simply surviving, academia: insights gained after nearly 40 years in STEM

Margot Gerritsen
(Stanford)
Abstract

This event will take place in L1 and on Teams. A link will be available 30 minutes before the session begins. 

 

It's hard to believe: I've spent nearly 40 years in STEM. In that time, much has changed: we moved from typewriters to PCs, from low-performance to high-performance computing, from data-supported research to data-driven research, from traditional languages such as Fortran to a plethora of programming environments. And the rate of change seems to increase constantly. Some things have stayed more or less the same, such as the (lack of) diversity of the STEM community, the level of stress and the struggles we all experience (and the joys!). In this talk, I will reflect on those years, on lessons learned and not learned or unlearned, on things I wish I had understood 40 years ago, and on things I still don't understand.

Margot is a professor at Stanford University in the Department of Energy Resources Engineering (ERE) and the Institute for Computational & Mathematical Engineering (ICME). Margot was born and raised in the Netherlands. Her STEM education started in 1982. In 1990 she received an MSc in applied mathematics from Delft University and then left her home country to search for sunnier and hillier places. She moved to Colorado and a year later to California to join the PhD program in Scientific Computing and Computational Mathematics at Stanford. During her PhD, Margot spent several quarters at Oxford University (with very good memories). Before returning to Stanford as a faculty member in ERE, Margot spent 5 years as a lecturer at the University of Auckland, New Zealand. From 2010 to 2018, Margot was the director of ICME. During this directorship, she founded the Women in Data Science (WiDS) initiative, which is now a global organization active in over 70 countries. From 2015 to 2020, Margot was also the Senior Associate Dean of Educational Affairs at Stanford's School of Earth, Energy & Environmental Sciences. Currently, Margot still co-directs WiDS and is the Chair of the Board of SIAM. She has since moved back to the mountains (still sunny too) and now lives in Bend, Oregon.

Tue, 18 Jan 2022
14:30
Virtual

Constrained optimization on Riemannian manifolds

Melanie Weber
(Mathematical Institute, University of Oxford)
Abstract

Many applications involve non-Euclidean data, where exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard nonlinear programming approaches. This observation has resulted in an increasing interest in Riemannian methods in the optimization and machine learning community. In this talk, we consider the problem of optimizing a function on a Riemannian manifold subject to convex constraints. We introduce Riemannian Frank-Wolfe (RFW) methods, a class of projection-free algorithms for constrained geodesically convex optimization. To understand the algorithm’s efficiency, we discuss (1) its iteration complexity, (2) the complexity of computing the Riemannian gradient and (3) the complexity of the Riemannian “linear” oracle (RLO), a crucial subroutine at the heart of the algorithm. We complement our theoretical results with an empirical comparison of RFW against state-of-the-art Riemannian optimization methods. Joint work with Suvrit Sra (MIT).
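
For readers unfamiliar with the classical algorithm being generalised, the sketch below (an illustrative Euclidean example under my own assumptions, not RFW itself) runs Frank-Wolfe on the probability simplex: each step calls a linear oracle over the constraint set and then moves along the segment towards its output, which is the role played by the Riemannian "linear" oracle and geodesics in RFW.

    # Illustrative Euclidean Frank-Wolfe on the probability simplex
    # (a sketch of the classical method that RFW generalises).
    import numpy as np

    def frank_wolfe_simplex(grad, x0, iters=200):
        x = x0.copy()
        for t in range(iters):
            g = grad(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0            # linear oracle: best vertex of the simplex
            gamma = 2.0 / (t + 2.0)          # standard step size
            x = (1 - gamma) * x + gamma * s  # move along the segment (a geodesic in RFW)
        return x

    # Example: minimise ||x - b||^2 over the simplex, with b already feasible
    b = np.array([0.2, 0.5, 0.3])
    x = frank_wolfe_simplex(lambda x: 2 * (x - b), np.array([1.0, 0.0, 0.0]))
    print(x)   # approaches b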
