Rice DSP graduate student Jasper Tan successfully defended his PhD thesis entitled "Privacy-Preserving Machine Learning: The Role of Overparameterization and Solutions in Computational Imaging."
Abstract: While the accelerating deployment of machine learning (ML) models brings benefits to various aspects of human life, it also opens the door to serious privacy risks. In particular, it is sometimes possible to reverse engineer a given model to extract information about the data on which it was trained. Such leakage is especially dangerous if the model's training data contains sensitive information, such as medical records, personal media, or consumer behavior. This thesis addresses two central questions about this privacy issue: (1) "what makes ML models vulnerable to privacy attacks?" and (2) "how do we preserve privacy in ML applications?". For question (1), I present a detailed analysis of how increased overparameterization affects a model's vulnerability to the membership inference (MI) attack: the task of identifying whether or not a given data point was included in the model's training dataset. I show, both theoretically and empirically, multiple settings wherein increased overparameterization leads to increased vulnerability to MI even while improving generalization performance. However, I then show that incorporating proper regularization while increasing overparameterization can eliminate this effect and can actually increase privacy while preserving generalization performance, yielding a "blessing of dimensionality" for privacy through regularization. For question (2), I present results on two privacy-preserving techniques, synthetic training data simulation and privacy-preserving sensing, both in the domain of computational imaging. I first present a training data simulator for accurate ML-based depth of field (DoF) extension for time-of-flight (ToF) imagers, resulting in a 3.6x increase in a conventional ToF camera's DoF when used with a deblurring neural network. This simulator allows ML to be used without the need for potentially private real training data.
Second, I propose a design for a sensor whose measurements obscure the identities of the people it observes while still allowing person detection to be performed. Ultimately, it is my hope that these findings and results bring the community one step closer to the responsible deployment of ML models without putting sensitive user data at risk.
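To make the membership inference (MI) attack mentioned in the abstract concrete, below is a minimal, hypothetical sketch (not from the thesis) of the classic loss-threshold variant of MI, using an overparameterized linear regression fit by minimum-norm least squares. Because the model has more parameters than training points, it interpolates its training data, so members have near-zero loss while non-members do not; the attacker simply thresholds the per-point loss. All names, dimensions, and the threshold choice here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: d > n_train, so the model is overparameterized
# and the minimum-norm least-squares fit interpolates the training set.
n_train, n_nonmem, d = 20, 200, 100
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n_train, d))
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)   # noisy labels
X_nonmem = rng.normal(size=(n_nonmem, d))                     # non-members
y_nonmem = X_nonmem @ w_true + 0.5 * rng.normal(size=n_nonmem)

# Minimum-norm least-squares solution via the pseudoinverse.
w_hat = np.linalg.pinv(X_train) @ y_train

# Per-point squared losses for members vs. non-members.
loss_mem = (X_train @ w_hat - y_train) ** 2
loss_non = (X_nonmem @ w_hat - y_nonmem) ** 2

# Loss-threshold MI attack: predict "member" when the loss is below a
# threshold (here, an assumed attacker guess at the label-noise level).
threshold = 0.25
acc = 0.5 * ((loss_mem < threshold).mean() + (loss_non >= threshold).mean())
print(f"membership inference accuracy: {acc:.2f}")  # 0.5 would be chance
```

In this interpolating regime the attack is far better than chance, which illustrates the abstract's point that overparameterization can heighten MI vulnerability; adding regularization (e.g. ridge instead of the pseudoinverse) shrinks the member/non-member loss gap.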
Jasper’s next step is the start-up GLASS Imaging, where he will leverage machine learning and computational photography techniques to design a new type of camera that delivers SLR image quality yet fits in your pocket.