Finite element schemes and mesh smoothing for geometric evolution problems
Abstract
Geometric evolutions can arise as simple models or fundamental building blocks in various applications with moving boundaries and time-dependent domains, such as grain boundaries in materials or deforming cell boundaries. Mesh-based methods require adaptation and smoothing, particularly in the case of strong deformations. We consider finite element schemes based on classical approaches for geometric evolution equations but augmented with the gradient of the Dirichlet energy, or a variant of it, which is known to produce a tangential mesh movement that is beneficial for the mesh quality. We focus on the one-dimensional case, where convergence of semi-discrete schemes can be proved, and discuss two settings. For networks forming triple junctions, it is desirable to keep the impact of any additional mesh-smoothing terms on the geometric evolution as small as possible, which can be achieved with a perturbation approach. Regarding the elastic flow of curves, the Dirichlet energy can serve as a replacement for the usual penalty in terms of the length functional in that, modulo rescaling, it yields the same minimisers in the long run.
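As a point of reference for how such a tangential term can arise (a minimal sketch assuming curve shortening flow as the model problem; it does not reproduce the schemes of the talk), one may write the reparametrised flow driven in the tangential direction by the Dirichlet energy of the parametrisation:

% Assumed model problem: curve shortening flow for \gamma(\cdot,t)\colon I \to \mathbb{R}^2,
% with Dirichlet energy D(\gamma) = \tfrac12 \int_I |\gamma_x|^2 \,\mathrm{d}x of the parametrisation.
\[
  \partial_t \gamma \;=\; \frac{\gamma_{xx}}{|\gamma_x|^{2}}
  \;=\; \underbrace{\kappa\,\nu}_{\text{normal (geometric) motion}}
  \;+\; \underbrace{\frac{\gamma_{xx}\cdot\tau}{|\gamma_x|^{2}}\,\tau}_{\text{tangential mesh movement}},
  \qquad
  \tau = \frac{\gamma_x}{|\gamma_x|}, \quad
  \kappa\,\nu = \frac{1}{|\gamma_x|}\,\partial_x\!\Big(\frac{\gamma_x}{|\gamma_x|}\Big).
\]

The normal part is the geometric evolution itself, while the tangential part, coming from the $L^2$-gradient $-\gamma_{xx}$ of $D$, redistributes the nodes of a finite element discretisation along the curve; in the same spirit, for the elastic flow the length penalty $\lambda\,L(\gamma)$ in $\tfrac12\int_\gamma \kappa^2\,\mathrm{d}s + \lambda\,L(\gamma)$ can be replaced by $\lambda\,D(\gamma)$.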
Group-invariant tensor train networks for supervised learning
Abstract
Invariance under selected transformations has recently proven to be a powerful inductive bias in several machine learning models. One class of such models is tensor train networks. In this talk, we impose invariance relations on tensor train networks. We introduce a new numerical algorithm to construct a basis of tensors that are invariant under the action of normal matrix representations of an arbitrary discrete group. This method can be up to several orders of magnitude faster than previous approaches. The group-invariant tensors are then combined into a group-invariant tensor train network, which can be used as a supervised machine learning model. We apply this model to a protein binding classification problem, taking into account problem-specific invariances, and obtain prediction accuracy in line with state-of-the-art invariant deep learning approaches. This is joint work with Brent Sprangers.
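The algorithm of the talk is not described here; as a rough illustration of what a basis of group-invariant tensors is, the following sketch (our own hypothetical helpers reynolds_projector and invariant_basis, assuming an orthogonal representation of a finite group acting on order-2 tensors) computes such a basis with the standard group-averaging (Reynolds) projector, i.e. the slow baseline that dedicated algorithms aim to improve upon.

import numpy as np

def reynolds_projector(reps):
    # Average of the two-fold Kronecker-product action of a finite group;
    # for an orthogonal representation this is the orthogonal projector
    # onto the subspace of invariant order-2 tensors.
    d = reps[0].shape[0]
    P = np.zeros((d * d, d * d))
    for R in reps:
        P += np.kron(R, R)
    return P / len(reps)

def invariant_basis(reps, tol=1e-10):
    # Columns form an orthonormal basis of the invariant subspace: the
    # singular vectors of the projector with singular value (close to) 1.
    U, s, _ = np.linalg.svd(reynolds_projector(reps))
    return U[:, s > 1.0 - tol]

# Example: the group S_2 acting on R^3 by swapping the first two coordinates.
identity = np.eye(3)
swap = np.array([[0.0, 1.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0]])
basis = invariant_basis([identity, swap])
print(basis.shape)  # (9, 5): five linearly independent invariant order-2 tensors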