The following Test of Time Awards will be given at STOC 2025.


The 30-year Test-of-Time award recognizes two seminal papers published in 1995:

The main result of Raz’s paper on parallel repetition is that the value of a 2-player game decays exponentially with the number of parallel repetitions, enabling a boost in soundness without increasing the round complexity. In conjunction with the PCP Theorem, the result has been the basis for a large fraction of approximation results for NP-hard problems over the last three decades. Raz’s paper also gave rise to the notion of smoothed parallel repetition, which has played an important role in recent work on PCPs. While the result has since been improved, most of these subsequent improvements use the same information-theoretic approach, which Raz pioneered in the paper. The techniques introduced in this paper have also played an important role in communication complexity and direct product testing.

The paper by Hannenhalli and Pevzner gave the first polynomial-time algorithm for solving the Sorting By Reversals Problem, a fundamental combinatorial problem used in biology to understand the common evolutionary history of two genomic sequences. The techniques in the paper used deep graph-theoretic insights and became the foundation of further algorithmic work in understanding genome rearrangements over the next three decades. This work also inspired a series of experimental studies and subsequent extensions to the original problem. Remarkably, the theory that emerged from this paper had a profound impact on models used by evolutionary biologists in understanding the basic mechanisms that drive genome rearrangement.


The 20-year Test-of-Time award recognizes a seminal paper published in STOC 2005:

In this landmark paper, Regev introduced the Learning With Errors (LWE) problem, which became the basis for a wealth of important developments in theoretical and practical cryptography over the last twenty years. The paper established the average-case hardness of LWE based on the worst-case hardness of lattice problems believed to be hard even for quantum computers, paving the way for post-quantum cryptography. In his paper, Regev provided a new cryptosystem based on LWE that was simpler and more efficient than previous lattice-based schemes. Since then, the paper has had wide-ranging influence. LWE opened up a new paradigm for designing novel cryptographic primitives that were previously unattainable. LWE has also become the basis for the design of quantum protocols, such as protocols for verifying quantum computation and generating provably random strings. Moreover, several leading candidates for practical post-quantum cryptosystems are based on LWE and its variants.


The 10-year Test-of-Time award recognizes three seminal papers published in STOC 2015:

The paper on Preserving Statistical Validity in Adaptive Data Analysis studies the fundamental problem in statistics of how to perform adaptive analysis on a dataset without overfitting. This work provided the first principled framework to study this problem, establishing that differentially private computations can be used to ensure statistical validity. The deep connection between differential privacy and statistics demonstrated that privacy is not only important as an end goal but can also improve seemingly unrelated aspects of statistical analysis. This paper inspired new lines of work on differential privacy and generalization and influenced how practitioners think about data reuse.

The paper on Hypercontractivity, Sum-of-Squares Proofs and Applications pioneered the proofs-to-algorithms framework based on the sum-of-squares hierarchy. The authors showed that if a proof of existence can be encapsulated as a system of polynomial inequalities with certain properties, one immediately gets an algorithm. The paper uses this paradigm to tackle problems involving matrix hypercontractive norms, small-set expansion, unique games, multivariate polynomial optimization, and quantum information. Since then, the sum-of-squares framework has inspired an explosion of subsequent algorithmic developments over the last decade and has had an important impact on many fields including hardness of approximation, statistical inference, combinatorial optimization, quantum information theory, and machine learning.

The paper on Electrical Flows, Laplacian Systems, and Faster Approximation of Maximum Flow in Undirected Graphs provided the first polynomial improvement in decades for approximately solving maximum flow in sparse, capacitated, undirected graphs. Moreover, the paper gave a proof of concept that optimization methods for Laplacian linear systems could be leveraged as an important tool in algorithmic graph theory. This observation inspired the powerful integration of continuous optimization methods with combinatorial graph techniques, thereby substantially enriching the algorithm design toolkit. The techniques introduced in this paper led to subsequent improvements for analogous multicommodity flow problems, faster algorithms for undirected graphs, new algorithms for directed graphs, and, recently, almost linear-time algorithms for maximum flow.


Award Committee:

Sandy Irani (chair, UC Irvine)
Ran Canetti (Boston U)
Mike Paterson (Warwick)
David Shmoys (Cornell)