
Congrats to Three Colleagues

August 30, 2023


And a fourth

Composite crop of src1, src2, src3

Vinod Vaikuntanathan, Santosh Vempala, and Virginia Williams have a common thread. No, it’s not that they all have a name that starts with a V. Yes, they all are at an institute whose name is of the form:

X Institute of Technology

But it is really that they all have made important contributions to theory. The Simons Foundation has selected them as winners of its 2023 Simons Investigator awards in theory.

They are at the end of the overall list of winners for 2023. No, it’s not because they have names that start with V, V, and W. Yes, it’s because the winners in each field are grouped together and theory comes last.

The Winners

Here are the citations for the three in Theoretical Computer Science:

  • Vinod Vaikuntanathan, Massachusetts Institute of Technology: Vinod Vaikuntanathan’s research is in the foundations of cryptography and its applications to theoretical computer science at large. He is known for his work on fully homomorphic encryption, a powerful cryptographic primitive that enables complex computations on encrypted data, as well as lattice-based cryptography, which lays down a new mathematical foundation for cryptography in the post-quantum world. Recently, he has been interested in the interactions of cryptography with quantum computing, as well as with statistics and machine learning.

  • Santosh Vempala, Georgia Institute of Technology: Santosh Vempala has made fundamental advances in the theory of algorithms: for sampling high-dimensional distributions, computing the volume of a convex body, optimization over convex sets, randomized matrix approximation, as well as basic problems in machine learning. In many cases, these were the first polynomial-time algorithms and co-evolved with insights into high-dimensional geometry and probability. Recent highlights include proving that sufficiently sparse linear systems can be solved faster than matrix multiplication ({Ax=b}, the workhorse of modern computation); extending sampling methods to non-Euclidean (Riemannian) geometries to make them faster (leading to practical methods in very high dimension); pioneering techniques for algorithmic robust statistics (immune to adversarial corruptions); and developing a rigorous theory of computation and learning in the brain in a biologically plausible model (how does the mind emerge from neurons and synapses?). He continues to be puzzled by whether an unknown polytope can be learned in polytime from samples, whether its diameter is bounded by a polynomial in its description length, whether its volume can be computed in polytime without randomization, and whether the answers to these questions will be discovered by humans or by AI.

  • Virginia Williams, Massachusetts Institute of Technology: Virginia Vassilevska Williams’ research is broadly in theoretical computer science, and more specifically in designing algorithms for graphs and matrices, and uncovering formal relationships between seemingly very different computational problems. She is known for her work on fast matrix multiplication, both in co-developing the world’s fastest algorithm and in proving limitations on the known techniques for designing matrix multiplication algorithms. Williams is also one of the original founders and leaders of fine-grained complexity, a blossoming new branch of computational complexity.

One More Makes Four

The areas of the other winners are: Mathematics, Physics, Theoretical Physics in Life Sciences, and Astrophysics. One of the winners in plain-vanilla Physics is another old friend:

  • Aram Harrow, Massachusetts Institute of Technology: Aram Harrow studies quantum computing and information. He works to understand the capabilities of the quantum computers and quantum communication devices we will build in the future, and in the process, he creates connections to other areas of theoretical physics, mathematics and computer science. He has developed quantum algorithms for solving large systems of linear equations and hybrid classical-quantum algorithms for machine learning. Harrow has also contributed to the intersection of quantum information and many-body physics, with work on thermalization, random quantum dynamics, and the “monogamy” property of quantum entanglement.


We interacted with Aram through almost the entire year 2012 as we hosted an eight-part debate between him and Gil Kalai on the scalability of universal quantum computing.

In his own work, one of the things he is famous for is the HHL algorithm with Avinatan Hassidim and Seth Lloyd. We covered it in a 2019 post. It is mainly what the citation’s line that he “has developed quantum algorithms for solving large systems of linear equations” refers to.

Now wait a second: this is just the same problem as solving {Ax=b}. The citation above for Vempala calls that, in its classical setting, “the workhorse of modern computation.” If that is so, then what should we call the quantum {Ax=b}? Note that the quantum {Ax=b} is only approximate, and comes with other caveats.
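To make the contrast concrete, here is a minimal classical sketch of {Ax=b} by Gaussian elimination (the function name and toy instance are our own; production codes use LAPACK and exploit sparsity). A classical solver returns the full vector {x} explicitly, whereas HHL prepares a quantum state proportional to {x}, from which one can only estimate a few properties, and only approximately, under conditions on the sparsity and condition number of {A}.

```python
# A minimal classical solver for Ax = b (illustrative sketch only).

def solve(A, b):
    n = len(b)
    # Build the augmented matrix [A | b] on copies of the input.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: move the largest entry onto the diagonal.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        # Eliminate entries below the pivot.
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution recovers x in full -- the classical "output."
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = solve(A, b)  # x = [1/11, 7/11]; HHL would only give a state ~ x
```

The point of the contrast: the classical algorithm above does work proportional to (at least) the size of {x}, while HHL can run in time polylogarithmic in the dimension precisely because it never writes {x} down.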

Should quantum {Ax=b} be called “the unicorn of modern computation?”

All kidding aside, we congratulate Aram on this award.

Open Problems

And our congrats to all the award winners in all areas.
