Large Margin Mechanism for Differentially Private Maximization

Friday, September 26, 2014
11:00am - 12:00pm
UTA 7.532

A basic problem in the design of privacy-preserving algorithms is the private maximization problem: the goal is to pick an item from a universe that (approximately) maximizes a data-dependent function, all under the constraint of differential privacy. This problem is central to many privacy-preserving algorithms for statistics and machine learning.
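To make the problem concrete, here is a minimal sketch of the standard baseline for private maximization, the exponential mechanism. This is not the speaker's new algorithm; it illustrates the range-dependent approach the talk improves on, since its utility guarantee degrades logarithmically with the universe size. The quality function and sensitivity parameter here are illustrative assumptions.

```python
import math
import random

def exponential_mechanism(universe, quality, sensitivity, epsilon, rng=random):
    """Privately select an item that approximately maximizes `quality`.

    Each item i is sampled with probability proportional to
    exp(epsilon * quality(i) / (2 * sensitivity)), which satisfies
    epsilon-differential privacy. Its utility loss grows with
    log(len(universe)) -- the range dependence discussed in the talk.
    """
    scores = [quality(item) for item in universe]
    max_score = max(scores)  # shift by the max for numerical stability
    weights = [math.exp(epsilon * (s - max_score) / (2.0 * sensitivity))
               for s in scores]
    total = sum(weights)
    # Sample an item with probability proportional to its weight.
    r = rng.random() * total
    acc = 0.0
    for item, w in zip(universe, weights):
        acc += w
        if r <= acc:
            return item
    return universe[-1]  # guard against floating-point rounding
```

For a large privacy budget the mechanism concentrates on the true maximizer; as epsilon shrinks, the output distribution flattens toward uniform over the universe.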

Previous algorithms for this problem are either range-dependent---i.e., their utility diminishes with the size of the universe---or apply only to very restricted function classes. Prof. Hsu will describe a new, general-purpose, range-independent algorithm for private maximization that guarantees approximate differential privacy. He will also describe applications to private frequent itemset mining and private PAC learning.

This is joint work with Kamalika Chaudhuri and Shuang Song.


Daniel J. Hsu
Assistant Professor
Columbia University

Daniel J. Hsu is an assistant professor in the Department of Computer Science at Columbia University. Previously, he was a postdoc at Microsoft Research New England from 2011 to 2013; before that, from 2010 to 2011, he was a postdoc with the statistics departments at Rutgers University and the University of Pennsylvania, supervised by Tong Zhang and Sham M. Kakade. He received his Ph.D. in Computer Science in 2010 from the Department of Computer Science and Engineering at UC San Diego, where he was advised by Sanjoy Dasgupta, and his B.S. in Computer Science and Engineering in 2004 from the Department of Electrical Engineering and Computer Sciences at UC Berkeley.

His research interests are in algorithmic statistics and machine learning.