Path Thresholding (PaTh)

Selecting tuning parameters for sparse regression algorithms is difficult because the optimal value depends on either the unknown sparsity level of the sparse vector or the unknown noise variance. In practice, practitioners resort to cross-validation (CV) or information-criterion-based methods, both of which are known to be suboptimal in the high-dimensional setting where the number of variables is much greater than the number of observations. Stability selection is suitable for high-dimensional problems, but it can be computationally expensive.

PaTh has two appealing properties:
(1) It can be used with ANY sparse regression algorithm.
(2) It replaces the model-dependent tuning parameter with a parameter that can simply be set to a constant.

Given a sufficient number of observations, while still in the high-dimensional regime, the performance of PaTh is independent of any tuning parameter. In other settings, PaTh can significantly reduce the number of candidate sparse solutions. The code is written so that PaTh can wrap any implementation of a sparse regression algorithm; in particular, we demonstrate how to use it with the Lasso, OMP, and SWAP.
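To make the wrapping idea concrete, the sketch below runs OMP to generate a solution path and stops at the first support along the path whose residual looks like pure noise. This is an illustrative Python sketch only: the residual-correlation test and the constant `c` are assumptions in the spirit of path thresholding, not the exact rule or constants from the paper or the released code.

```python
import numpy as np

def omp_path(X, y, max_k):
    """Orthogonal Matching Pursuit, returning the nested supports
    visited along the path (one support per sparsity level)."""
    residual = y.copy()
    support, path = [], []
    for _ in range(max_k):
        j = int(np.argmax(np.abs(X.T @ residual)))
        if j in support:
            break
        support.append(j)
        beta, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ beta
        path.append(list(support))
    return path

def path_threshold(X, y, path, c=4.0):
    """Illustrative PaTh-style stopping rule (assumed form, not the
    paper's exact test): walk the path, estimate the noise variance
    from the current residual, and stop as soon as no remaining
    column is significantly correlated with that residual."""
    n, p = X.shape
    for support in path:
        beta, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        resid = y - X[:, support] @ beta
        sigma2_hat = np.sum(resid ** 2) / n      # noise-variance proxy
        rest = [j for j in range(p) if j not in support]
        if not rest:
            return support
        # squared residual correlation of each unused column
        max_corr2 = max((X[:, j] @ resid) ** 2 / (X[:, j] @ X[:, j])
                        for j in rest)
        if max_corr2 <= c * sigma2_hat * np.log(p):
            return support                       # nothing left to explain
    return path[-1]

# Demo: n = 100 observations, p = 200 variables, 3-sparse signal.
rng = np.random.default_rng(0)
n, p, true_support = 100, 200, [5, 50, 123]
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[true_support] = 5.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)
selected = path_threshold(X, y, omp_path(X, y, max_k=10))
```

Note that `omp_path` could be swapped for any routine that returns a sequence of candidate supports (e.g., the supports visited along a Lasso path), which is the sense in which the stopping rule is algorithm-agnostic.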

Reference:
Divyanshu Vats and Richard G. Baraniuk, "Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression," in Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), 2014.

Authors: Divyanshu Vats, Richard G. Baraniuk


Copyright ©2014, DSP Group, Rice University

Rice University, MS-380 - 6100 Main St - Houston, TX 77005 - USA - webmaster-dsp@ece.rice.edu