
Richard Baraniuk has been elected to the National Academy of Engineering, one of 67 new members and 12 foreign members elected this year, "for the development and broad dissemination of open educational resources and for foundational contributions to compressive sensing." Election to the National Academy of Engineering is among the highest professional distinctions accorded to an engineer. More from Rice News.

Richard Baraniuk will present the 2023 AMS Josiah Willard Gibbs Lecture at the Joint Mathematics Meetings in Boston, Massachusetts, in January 2023. The first AMS Josiah Willard Gibbs Lecture was given in 1923, and this public lecture remains one of the signature events in the Society's calendar. Previous speakers have included Albert Einstein, Vannevar Bush, John von Neumann, Norbert Wiener, Kurt Gödel, Hermann Weyl, Eugene Wigner, Donald Knuth, Herb Simon, David Mumford, Ingrid Daubechies, and Claude Shannon.

Richard G. Baraniuk, the C. Sidney Burrus Professor of Electrical and Computer Engineering (ECE) and founding director of OpenStax, Rice's educational technology initiative, has received the Harold W. McGraw, Jr. Prize in Education. The award is given annually by the Harold W. McGraw, Jr. Family Foundation and the University of Pennsylvania Graduate School of Education and goes to "outstanding individuals whose accomplishments are making a difference in the lives of students." Baraniuk is one of the founders of the Open Education movement, which promotes the use of free and open-source-licensed Open Educational Resources. He founded OpenStax (formerly Connexions) in 1999 as a non-profit educational and scholarly publishing project to bring textbooks and other learning materials into the digital age.

D. LeJeune, H. Javadi, R. G. Baraniuk, "The flip side of the reweighted coin: Duality of adaptive dropout and regularization," NeurIPS 2021, arXiv:2106.07769.

Among the most successful methods for sparsifying deep (neural) networks are those that adaptively mask the network weights throughout training. By examining this masking, or dropout, in the linear case, we uncover a duality between such adaptive methods and regularization through the so-called "η-trick" that casts both as iteratively reweighted optimizations. We show that any dropout strategy that adapts to the weights in a monotonic way corresponds to an effective subquadratic regularization penalty, and therefore leads to sparse solutions. We obtain the effective penalties for several popular sparsification strategies, which are remarkably similar to classical penalties commonly used in sparse optimization. Considering variational dropout as a case study, we demonstrate similar empirical behavior between the adaptive dropout method and classical methods on the task of deep network sparsification, validating our theory.
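The η-trick at the heart of this duality can be illustrated in the linear setting: an L1 penalty, rewritten via |w| = min_η (w²/η + η)/2, turns into a sequence of adaptively reweighted ridge (quadratic) problems whose weights depend monotonically on the current iterate. The following minimal sketch (problem sizes, the regularization strength, and the stopping rule are illustrative choices, not values from the paper) shows the reweighting driving most coefficients to zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse linear regression problem (illustrative dimensions)
n, d = 50, 20
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]          # only three active coefficients
y = X @ w_true + 0.01 * rng.standard_normal(n)

lam = 1.0                               # L1 regularization strength (arbitrary)
eps = 1e-8                              # keeps the reweighting defined near zero

# eta-trick: |w_i| = min_eta (w_i^2/eta + eta)/2, so the L1-penalized
# objective is minimized by alternating between eta = |w| and a
# weighted ridge problem -- an iteratively reweighted optimization.
w = np.ones(d)
for _ in range(100):
    eta = np.abs(w) + eps               # adaptive weights from the current iterate
    # w = argmin ||y - Xw||^2 / 2 + (lam/2) * sum_i w_i^2 / eta_i
    w = np.linalg.solve(X.T @ X + lam * np.diag(1.0 / eta), X.T @ y)

print(np.round(w, 3))                   # inactive coordinates shrink toward zero
```

Because the weight 1/η_i grows as |w_i| shrinks, small coefficients are penalized ever more heavily and collapse to (numerical) zero, which is the subquadratic-penalty/sparsity mechanism the paper identifies in adaptive dropout.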

P. K. Kota, D. LeJeune, R. A. Drezek, R. G. Baraniuk, "Extreme Compressed Sensing of Poisson Rates from Multiple Measurements," arxiv.org/abs/2103.08711, March 15, 2021.

Compressed sensing (CS) is a signal processing technique that enables the efficient recovery of a sparse high-dimensional signal from low-dimensional measurements. In the multiple measurement vector (MMV) framework, a set of signals with the same support must be recovered from their corresponding measurements. Here, we present the first exploration of the MMV problem where signals are independently drawn from a sparse, multivariate Poisson distribution. We are primarily motivated by a suite of biosensing applications of microfluidics where analytes (such as whole cells or biomarkers) are captured in small volume partitions according to a Poisson distribution. We recover the sparse parameter vector of Poisson rates through maximum likelihood estimation with our novel Sparse Poisson Recovery (SPoRe) algorithm. SPoRe uses batch stochastic gradient ascent enabled by Monte Carlo approximations of otherwise intractable gradients. By uniquely leveraging the Poisson structure, SPoRe substantially outperforms a comprehensive set of existing and custom baseline CS algorithms. Notably, SPoRe can exhibit high performance even with one-dimensional measurements and high noise levels. This resource efficiency is not only unprecedented in the field of CS but is also particularly potent for applications in microfluidics in which the number of resolvable measurements per partition is often severely limited. We prove the identifiability property of the Poisson model under such lax conditions, analytically develop insights into system performance, and confirm these insights in simulated experiments. Our findings encourage a new approach to biosensing and are generalizable to other applications featuring spatial and temporal Poisson signals.
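The Monte Carlo gradient idea behind SPoRe can be sketched in a stripped-down form: latent integer signals are drawn from Poisson(λ), each measurement's intractable marginal-likelihood gradient is approximated by sampling candidate signals from the current rate estimate and importance-weighting them by the Gaussian measurement likelihood, and λ is updated by stochastic gradient ascent. All dimensions, the noise model, the sample counts, and the step size below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy setup (not the paper's parameters)
N, M, D = 5, 2, 200        # signal dim, measurement dim, number of measurement vectors
S = 500                    # Monte Carlo samples per gradient evaluation
sigma = 0.1                # Gaussian measurement noise level

lam_true = np.array([2.0, 0.0, 0.0, 0.0, 1.0])    # sparse Poisson rates
Phi = rng.uniform(0.2, 1.0, size=(M, N))          # sensing matrix

X = rng.poisson(lam_true, size=(D, N))            # latent integer signals
Y = X @ Phi.T + sigma * rng.standard_normal((D, M))

def mc_grad(lam, y):
    """Monte Carlo estimate of grad_lambda log p(y | lambda).

    Samples x ~ Poisson(lambda) and importance-weights each sample by the
    Gaussian measurement likelihood p(y | x); the otherwise intractable
    marginal gradient becomes a weighted average of x/lambda - 1.
    """
    xs = rng.poisson(lam, size=(S, N)).astype(float)
    resid = y - xs @ Phi.T                        # (S, M) residuals
    logw = -0.5 * np.sum(resid**2, axis=1) / sigma**2
    w = np.exp(logw - logw.max())                 # stabilized importance weights
    w /= w.sum()
    return w @ (xs / np.maximum(lam, 1e-8) - 1.0)

# Batch stochastic gradient ascent on the total log-likelihood,
# projecting onto the nonnegative orthant after each step.
lam = np.full(N, 1.0)
step = 0.05
for _ in range(300):
    y = Y[rng.integers(D)]                        # minibatch of size 1
    lam = np.maximum(lam + step * mc_grad(lam, y), 0.0)
```

The key structural point mirrored here is that every measurement vector shares the same rate vector λ, so even very low-dimensional (here M = 2) noisy measurements jointly constrain the estimate as the number of measurement vectors grows.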

Rice DSP faculty member Yingyan Lin has received an NSF CAREER award for her project "Differentiable Network-Accelerator Co-Search – Towards Ubiquitous On-Device Intelligence and Green AI." The project has two main aims: first, to bridge the vast gap between deep learning's prohibitive computational and energy demands and the constrained resources of consumer devices, and second, to reduce the sizable environmental impact of energy-intensive deep learning training.

DSP alum Justin Romberg (PhD, 2003), Schlumberger Professor of Electrical and Computer Engineering at Georgia Tech, has been awarded the 2021 IEEE Jack S. Kilby Signal Processing Medal. He and his co-awardees Emmanuel Candès of Stanford University and Terence Tao of UCLA will receive the highest honor in the field of signal processing for "groundbreaking contributions to compressed sensing."

Justin joins Rice DSP alum Jim McClellan (PhD, 1973), John and Marilu McCarty Chair of Electrical Engineering at Georgia Tech, and Rice DSP emeritus faculty member C. Sidney Burrus as recipients of this honor.