Integer programming, the problem of finding an optimal integer solution satisfying linear constraints, is one of the most fundamental problems in discrete optimization. In the first part of this talk, I will discuss the important open problem of whether there exists a single-exponential time algorithm for solving a general n-variable integer program, where the best current algorithm requires n^{O(n)} time. I will use this to motivate a beautiful conjecture of Kannan & Lovász (KL) regarding how "flat" convex bodies not containing integer points must be.

The l_2 case of KL was recently resolved in breakthrough work by Regev & Stephens-Davidowitz '17, who proved a more general "Reverse Minkowski" theorem which gives an effective way of bounding lattice point counts inside any ball around the origin as a function of sublattice determinants. In both cases, the existence of certain "witness" lattice subspaces, which explain the geometric parameters of the lattice, is proved in a non-constructive way. In this work, as my first result, I show how to make these results constructive in 2^{O(n)} time, i.e. how to actually find these witness subspaces, using discrete Gaussian sampling techniques. As a second main result, I show an improved complexity characterization for approximating the covering radius of a lattice, i.e. the farthest distance of any point in space to the lattice. In particular, assuming the slicing conjecture, I show that this problem is in coNP for a constant approximation factor, which improves on the corresponding O(log^{3/2} n) approximation factor implied by Regev & Stephens-Davidowitz's proof of the l_2 KL conjecture.
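For concreteness, the covering radius discussed above can be written as follows (for a full-rank lattice $\mathcal{L} \subseteq \mathbb{R}^n$):

```latex
\[
  \mu(\mathcal{L}) \;=\; \max_{x \in \mathbb{R}^n} \; \min_{y \in \mathcal{L}} \; \lVert x - y \rVert_2 ,
\]
```

i.e. the radius of the largest ball that can be centered anywhere in space while avoiding all lattice points; the coNP result concerns deciding approximations to this quantity.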

# Past Cryptography Seminar

With the evolution towards the IoT and cyber-physical systems, the role that the underlying hardware plays in securing an application is becoming more prominent. Hardware can be used constructively, e.g., for accelerating computationally-intensive cryptographic algorithms. Hardware can also be used destructively, e.g., for physical attacks or transistor-level Trojans which are virtually impossible to detect. In this talk, we will present case studies for high-speed cryptography used in car2x communication and recent research on low-level hardware Trojans.

Broken cryptographic algorithms and hardness assumptions are a constant threat to real-world protocols. Prominent examples are hash functions for which collisions become known, or number-theoretic assumptions which are threatened by advances in quantum computing. Especially when it comes to key exchange protocols, the switch to quantum-resistant primitives has begun and aims to protect today’s secrets against future developments, moving from common Diffie–Hellman-based solutions to Learning-With-Errors-based approaches. Remarkably, the authentication step in such protocols is usually still carried out with quantum-vulnerable signature schemes. The intuition here is that the adversary would need to break this protocol primitive today, without having quantum power yet. The question we address here is whether this intuition is justified, and if so, whether we can show this rigorously. We particularly consider the authenticated variant of the recently introduced post-quantum secure key exchange protocol NewHope (Alkim et al., USENIX Security 2016), as well as its use in TLS 1.3, which is currently being developed by the Internet Engineering Task Force.

Adaptive oblivious transfer (OT) is a protocol where a sender initially commits to a database {M_i}_{i=1}^N. Then, a receiver can query the sender up to k times with private indices ρ_1, …, ρ_k so as to obtain M_{ρ_1}, …, M_{ρ_k} and nothing else. Moreover, for each i ∈ [k], the receiver’s choice ρ_i may depend on previously obtained messages {M_{ρ_j}}_{j<i}. Oblivious transfer with access control (OT-AC) is a flavor of adaptive OT where database records are protected by distinct access control policies that specify which credentials a receiver should obtain in order to access each M_i. So far, all known OT-AC protocols either only support access policies made of conjunctions or rely on ad hoc assumptions in pairing-friendly groups (or both). In this paper, we provide an OT-AC protocol where access policies may consist of any branching program of polynomial length, which is sufficient to realize any access policy in NC^1. The security of our protocol is proved under the Learning-With-Errors (LWE) and Short-Integer-Solution (SIS) assumptions. As a result of independent interest, we provide protocols for proving the correct evaluation of a committed branching program on a committed input.

Joint work with Benoît Libert, San Ling, Khoa Nguyen and Huaxiong Wang.
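The functionality targeted above (not the cryptographic protocol itself) can be sketched as a toy ideal functionality in Python. All names here are illustrative, and the sketch only models what each party is allowed to learn; there is no cryptography in it:

```python
# Toy model of the *ideal* adaptive-OT-with-access-control functionality.
# It only tracks what the receiver may learn; the real protocol achieves
# this while hiding the queried indices from the sender.

class OTACFunctionality:
    def __init__(self, database, policies, k):
        # database: list of N messages M_1..M_N
        # policies: policies[i] is a predicate over the receiver's credentials
        # k: maximum number of adaptive queries
        assert len(database) == len(policies)
        self.db = database
        self.policies = policies
        self.queries_left = k

    def query(self, index, credentials):
        """Receiver adaptively asks for M_index (1-indexed); it obtains
        M_index only if the record's access policy accepts its credentials."""
        if self.queries_left == 0:
            raise RuntimeError("query budget exhausted")
        self.queries_left -= 1
        if not self.policies[index - 1](credentials):
            return None          # policy not satisfied: nothing is revealed
        return self.db[index - 1]

# Usage: records guarded by simple conjunctive policies over credential sets.
F = OTACFunctionality(
    database=["M1", "M2", "M3"],
    policies=[
        lambda creds: "student" in creds,                  # M1
        lambda creds: {"student", "staff"} <= set(creds),  # M2: a conjunction
        lambda creds: True,                                # M3: public
    ],
    k=2,
)
assert F.query(3, ["student"]) == "M3"
assert F.query(2, ["student"]) is None   # lacks the "staff" credential
```

Each later query may of course depend on the messages returned by earlier ones, which is exactly the adaptivity discussed above.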

Identity-based cryptography can be useful in situations where a full-scale public-key infrastructure is impractical. Original identity-based proposals relied on elliptic curve pairings and so are vulnerable to quantum computers. I will describe some on-going work to design a post-quantum identity-based encryption scheme using ideas from Ring Learning with Errors. Our scheme has the advantage that it can be extended to the hierarchical setting for more flexible key management.

In this lecture, we present a practical and provably secure (authenticated) key exchange protocol and a password-authenticated key exchange protocol, both based on the learning with errors problem. These protocols are conceptually simple and have strong provable security properties. Constructions of this type were first proposed in 2011-2012, and the protocols have since been shown to be practical indeed. We will explain how all existing LWE-based key exchanges are variants of this fundamental design. In addition, we will discuss some issues with key reuse, and how the signal function invented for key exchange can be used in authentication schemes.
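To illustrate the role of the signal function, here is a toy single-coefficient sketch of the reconciliation step in the style of Ding, Xie and Lin's LWE-based key exchange. The parameter choices and variable names are illustrative, and real schemes work with polynomial or vector coefficients and carefully chosen error distributions:

```python
import random

q = 12289  # an odd modulus, as used in several LWE-based schemes

def centered(x):
    """Representative of x mod q in (-q/2, q/2]."""
    x %= q
    return x - q if x > q // 2 else x

def signal(y):
    """Signal (hint) bit: 0 iff y lies in the middle region [-q/4, q/4]."""
    return 0 if -(q // 4) <= centered(y) <= q // 4 else 1

def extract(x, sigma):
    """Robust extractor: maps two close values (with an even difference)
    to the same bit, using the published signal to avoid wraparound."""
    return centered(x + sigma * (q - 1) // 2) % 2

# Toy run of the unauthenticated core: both parties end up holding values
# that differ only by a small *even* error, then reconcile to one bit.
random.seed(1)
a = random.randrange(q)                       # public ring/field element
sA, sB = random.randrange(-2, 3), random.randrange(-2, 3)  # small secrets
eA, eB = random.randrange(-2, 3), random.randrange(-2, 3)  # small errors
pA = (a * sA + 2 * eA) % q                    # Alice's public value
pB = (a * sB + 2 * eB) % q                    # Bob's public value
kA = (pB * sA) % q                            # = a*sA*sB + 2*eB*sA (mod q)
kB = (pA * sB) % q                            # = a*sA*sB + 2*eA*sB (mod q)
sigma = signal(kB)                            # Bob publishes the signal bit
assert extract(kA, sigma) == extract(kB, sigma)  # shared key bit agrees
```

The key-reuse issues mentioned in the abstract arise precisely because the published signal bit leaks information about the secret when the same key is reused across many sessions.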

Direct Anonymous Attestation (DAA) is a protocol that allows a security chip embedded in a platform such as a laptop to authenticate itself as a genuine chip. Different authentications are not linkable, thus the protocol protects the privacy of the platform. The first DAA protocol was proposed by Brickell, Camenisch, and Chen and was standardized in 2004 by the Trusted Computing Group (TCG). Implementations of this protocol were rather slow because it is based on RSA. Later, alternative and faster protocols were proposed based on elliptic curves. Recently, the specification by the TCG was updated to allow for DAA protocols based on elliptic curves. Unfortunately, the new standard does not allow for provably secure DAA protocols. In this talk, we will review some of the history of DAA and then discuss the latest protocols, security models, and finally a provably secure realization of DAA based on elliptic curves.

Hash Proof Systems or Smooth Projective Hash Functions (SPHFs) are a form of implicit arguments introduced by Cramer and Shoup at Eurocrypt '02. They have found many applications since then, in particular for authenticated key exchange and honest-verifier zero-knowledge proofs. While they are relatively well understood in group settings, they seem painful to construct directly in the lattice setting.

Only one construction of an SPHF over lattices has been proposed, by Katz and Vaikuntanathan at Asiacrypt '09. But this construction has an important drawback: it only works for an ad hoc language of ciphertexts. Concretely, the corresponding decryption procedure needs to be tweaked and now requires q trapdoor inversion attempts, where q is the modulus of the underlying Learning With Errors (LWE) problem.

Using harmonic analysis, we explain the source of this limitation and propose a way around it. We show how to construct SPHFs for standard languages of LWE ciphertexts, and we make our construction explicit over a tag-CCA2 encryption scheme à la Micciancio–Peikert (Eurocrypt '12).

If there is enough time, we will conclude with applications of these SPHFs to password-authenticated key exchange, honest-verifier zero-knowledge, and a variant of witness encryption.
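For contrast with the lattice setting, the group-setting SPHF of Cramer and Shoup for the DDH language {(g1^r, g2^r)} can be sketched in a few lines. The tiny parameters below are purely illustrative (a real instantiation uses a cryptographically large prime-order group):

```python
import random

# Toy group: quadratic residues mod a safe prime p = 2q + 1 (order-q subgroup).
q = 1019
p = 2 * q + 1          # 2039, also prime
g1, g2 = 4, 9          # squares mod p, hence generators of the order-q subgroup

# A word in the language L = {(g1^r, g2^r)} together with its witness r.
r = random.randrange(1, q)
word = (pow(g1, r, p), pow(g2, r, p))

# Hashing key (kept secret) and projection key (published).
alpha, beta = random.randrange(q), random.randrange(q)
hp = (pow(g1, alpha, p) * pow(g2, beta, p)) % p

def hash_with_key(word):
    """Hash computed with the secret hashing key; needs no witness."""
    u1, u2 = word
    return (pow(u1, alpha, p) * pow(u2, beta, p)) % p

def proj_hash(hp, r):
    """Hash computed from the projection key and the witness r only."""
    return pow(hp, r, p)

# Correctness: on words in the language the two hash values coincide,
# since (g1^r)^alpha * (g2^r)^beta = (g1^alpha * g2^beta)^r.
assert hash_with_key(word) == proj_hash(hp, r)
```

Smoothness (the other half of the definition) says that on words outside the language the hash value is statistically close to uniform even given the projection key; it is this property that is delicate to obtain for standard languages of LWE ciphertexts.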

TBA

In this talk, I’ll share the progress that we have made in the field of e-voting, including the proposal of a new paradigm of e-voting system called self-enforcing e-voting (SEEV). A SEEV system is End-to-End (E2E) verifiable, but it differs from all previous E2E systems in that it does not require tallying authorities. The removal of tallying authorities significantly simplifies the election management and makes the system much more practical than before. A prototype of a SEEV system based on the DRE-i protocol (Hao et al. USENIX JETS 2014) has been built and used regularly at Newcastle University for classroom voting and student prize competitions with very positive student feedback. Lessons from our experience of designing, analysing and deploying an e-voting system for real-world applications are also presented.