Costis Daskalakis received the Nevanlinna Prize last week for his groundbreaking results on the computational complexity of Nash equilibria and on mechanism design for multi-unit auctions. Congrats Costis!
In this post we want to celebrate another aspect of Costis's work: tackling questions at the intersection of statistics, machine learning, and theory. Indeed, over the past few years Costis and his students and collaborators have been at the forefront of some fundamental, yet quite topical questions: how to make sense of data when we have too little of it, or too little time?
Costis's contributions to these broad questions have been many. They range from the daunting "curse of dimensionality" (how can one say something meaningful about high-dimensional data, given limited computational power and/or observations? Under which assumptions, and in which scenarios, can one still take a principled and sound approach to hypothesis testing or density estimation?) to societal issues such as the tension between efficiency and privacy in hypothesis testing (is such a tension even necessary, or can we sometimes get differential privacy at no cost?), along with applications to biology and inference on genomic data.
Eagerly waiting for the next breakthroughs, once again — congratulations on this well-deserved award!
(Image credit: Sarah A. King, from the MIT Technology Review.)