Dmitriy (Tim) Kunisky Talk — Spring 2023 Invited Speaker Series

Title: What average-case optimization can tell number theory

Time: 11:00am – 12:00pm

Location: Mohler Lab, Room 453

Abstract:
The Paley graph is a classical number-theoretic construction of a graph that is believed to behave “pseudorandomly” in many regards. Accurately bounding the clique number of the Paley graph is a long-standing open problem in number theory, with applications to several other questions about the statistics of finite fields. I will present recent results studying the application of optimization methods to this problem, an illustration of the broader phenomenon of convex optimization being a useful proof technique in extremal combinatorics. First, we will consider sum-of-squares (SOS) relaxations of the clique number. I will show limitations of the degree 4 SOS relaxation in bounding the clique number, as well as formal and numerical evidence that this relaxation might still improve on the state of the art for the Paley graph. Next, we will consider simpler spectral bounds on the clique number, augmented with a “localization” to small subgraphs of the input graph. I will present predictions and partial results, inspired by free probability techniques in random matrix theory, on pseudorandomness of the spectra of induced subgraphs of the Paley graph, which suggest that these spectral bounds may be very fruitful. I will conclude by proposing a class of open problems in average-case optimization that would help understand the power and limitations of this approach.

Based partly on joint work with Xifan Yu.
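For readers unfamiliar with the central object of the talk, here is a minimal illustrative sketch (not from the talk itself) of the Paley graph construction and a brute-force computation of its clique number. The construction is standard: for a prime p ≡ 1 (mod 4), the vertices are the elements of F_p, with x adjacent to y when x − y is a nonzero quadratic residue mod p. All function names below are my own illustrative choices.

```python
import itertools
import numpy as np

def paley_graph(p):
    """Adjacency matrix of the Paley graph on F_p (p prime, p = 1 mod 4):
    x ~ y iff x - y is a nonzero quadratic residue mod p."""
    assert p % 4 == 1
    residues = {(x * x) % p for x in range(1, p)}
    A = np.zeros((p, p), dtype=int)
    for x in range(p):
        for y in range(x + 1, p):
            if (x - y) % p in residues:
                A[x, y] = A[y, x] = 1
    return A

def clique_number(A):
    """Brute-force clique number; only feasible for tiny graphs."""
    n = A.shape[0]
    best = 1
    for k in range(2, n + 1):
        if any(all(A[i, j] for i, j in itertools.combinations(S, 2))
               for S in itertools.combinations(range(n), k)):
            best = k
        else:
            break
    return best

A = paley_graph(13)
print(clique_number(A))            # clique number of Paley(13) is 3
print(np.linalg.eigvalsh(A)[-1])   # largest eigenvalue = degree (p-1)/2 = 6
```

The spectral bounds mentioned in the abstract exploit the fact that the Paley graph's eigenvalues are known exactly: the degree (p − 1)/2, and (−1 ± √p)/2 with high multiplicity, which is what drives the classical O(√p) clique-number bound that the talk's methods aim to improve.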

Bio: Dmitriy (Tim) Kunisky is a postdoctoral associate at Yale University’s Department of Computer Science, hosted by Daniel Spielman. Before joining Yale, he received his PhD in Mathematics from the Courant Institute at New York University, advised by Afonso Bandeira and Gerard Ben Arous. He is broadly interested in both heuristic and rigorous methods for assessing the computational hardness of random optimization problems, especially problems that can be formulated as convex optimization. More recently, he has also studied universality and derandomization techniques for showing that the phenomena governing random problems also occur in many semi-random and deterministic settings.
