C. A. Metzler, A. Mousavi, R. Heckel, and R. G. Baraniuk, “Unsupervised Learning with Stein’s Unbiased Risk Estimator,” https://arxiv.org/abs/1805.10531, June 2018.
Appearing today on NSF's Science360 News: Taking a closer look with a lens-free fluorescent microscope
J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, A. Veeraraghavan, “Single-Frame 3D Fluorescence Microscopy with Ultraminiature Lensless FlatScope,” Science Advances, Vol. 3, No. 12, 8 December 2017.
Abstract: Modern biology increasingly relies on fluorescence microscopy, which is driving demand for smaller, lighter, and cheaper microscopes. However, traditional microscope architectures suffer from a fundamental trade-off: As lenses become smaller, they must either collect less light or image a smaller field of view. To break this fundamental trade-off between device size and performance, we present a new concept for three-dimensional (3D) fluorescence imaging that replaces lenses with an optimized amplitude mask placed a few hundred micrometers above the sensor and an efficient algorithm that can convert a single frame of captured sensor data into high-resolution 3D images. The result is FlatScope: perhaps the world’s tiniest and lightest microscope. FlatScope is a lensless microscope that is scarcely larger than an image sensor (roughly 0.2 g in weight and less than 1 mm thick) and yet able to produce micrometer-resolution, high–frame rate, 3D fluorescence movies covering a total volume of several cubic millimeters. The ability of FlatScope to reconstruct full 3D images from a single frame of captured sensor data allows us to image 3D volumes roughly 40,000 times faster than a laser scanning confocal microscope while providing comparable resolution. We envision that this new flat fluorescence microscopy paradigm will lead to implantable endoscopes that minimize tissue damage, arrays of imagers that cover large areas, and bendable, flexible microscopes that conform to complex topographies.
Fig. 1. (A) Traditional microscopes capture the scene through an objective and tube lens (~20 to 460 mm), resulting in a quality image directly on the imaging sensor. (B) FlatScope captures the scene through an amplitude mask and spacer (~0.2 mm) and computationally reconstructs the image. Scale bars, 100 μm (inset, 50 μm). (C) Comparison of form factor and resolution for traditional lensed research microscopes, GRIN lens microscope, and FlatScope. FlatScope achieves high-resolution imaging while maintaining a large ratio of FOV relative to the cross-sectional area of the device (see Materials and Methods for elaboration). Microscope objectives are Olympus MPlanFL N (1.25×/2.5×/5×, NA = 0.04/0.08/0.15), Nikon Apochromat (1×/2×/4×, NA = 0.04/0.1/0.2), and Zeiss Fluar (2.5×/5×, NA = 0.12/0.25). (D) FlatScope prototype (shown without absorptive filter). Scale bars, 100 μm.
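The core idea of lensless imaging, replacing optics with a coded mask and a reconstruction algorithm, can be illustrated with a minimal sketch. This is not FlatScope's actual algorithm or calibration; it assumes a small hypothetical scene, a random binary mask matrix standing in for the optimized amplitude mask, and Tikhonov-regularized least squares for the reconstruction step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: each sensor pixel sums scene points weighted by a
# random binary amplitude mask (a stand-in for an optimized mask).
n_scene, n_sensor = 64, 96  # hypothetical sizes
A = rng.integers(0, 2, size=(n_sensor, n_scene)).astype(float)

# Sparse scene: a few fluorescent point sources.
x_true = np.zeros(n_scene)
x_true[[10, 30, 50]] = [1.0, 0.5, 0.8]

# Single noisy "capture" of coded sensor data.
y = A @ x_true + 0.01 * rng.standard_normal(n_sensor)

# Reconstruction: x_hat = argmin ||A x - y||^2 + lam ||x||^2,
# solved via the regularized normal equations.
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_scene), A.T @ y)

print(np.round(x_hat[[10, 30, 50]], 2))  # close to the true intensities
```

The key point mirrored from the paper: because the mask multiplexes every scene point onto many sensor pixels, a single frame contains enough information to invert the forward model computationally, with no lens forming the image.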
Rice DSP alum Andrew Lan (PhD, 2016) has accepted an Assistant Professor position at UMass Amherst in the Department of Computer Science. He has spent the past two years as a postdoc at Princeton University and OpenStax. His research interests revolve around the development of human-in-the-loop machine learning methods that enable scalable, effective, and fail-safe personalized learning in education. Andrew joins DSP PhD alum Marco Duarte at UMass.
Rice DSP postdoc Gautam Dasarathy has accepted an Assistant Professor position at Arizona State University in the Department of Electrical and Computer Engineering. His research interests lie at the intersection of machine learning, signal processing, statistics, and information theory. He is particularly interested in the integrated design of learning algorithms and data acquisition systems (e.g., active learning) that involve both machine and human components.
Dr. Akane Sano from the MIT Media Lab will be joining the Rice ECE department in Fall 2018 as an Assistant Professor. She is currently a Research Scientist in the Affective Computing Group at MIT and a visiting scientist/lecturer in the People-Aware Computing Lab at Cornell University. Her research develops new technologies to measure, forecast, understand and improve health and wellbeing using multi-modal wearable and mobile human sensing, data analysis/modeling and application development. She has focused on measuring and predicting stress, mental health, and sleep performance for the purpose of improving overall wellbeing.
Dr. Yingyan Lin will join the Rice ECE department as an Assistant Professor in Fall 2018. She is currently the Texas Instruments Visiting Assistant Professor at Rice. Her research interests are in the areas of energy-efficient machine learning systems for cloud and mobile computing, analog and mixed-signal circuits, error resiliency techniques, and VLSI circuits and architectures for machine learning systems on resource-constrained platforms. She received her PhD in 2017 from the University of Illinois at Urbana-Champaign.
Dr. Santiago Segarra from the MIT Institute for Data, Systems, and Society will join the Rice ECE department in Fall 2018 as an Assistant Professor. His research interests include network theory, data science, machine learning, and graph signal processing, with a focus on networks: ubiquitous data structures relevant to numerous fields, including engineering, economics, sociology, and medicine. Santiago earned his PhD from the University of Pennsylvania in 2016.
DSP graduate student Lorenzo Luzi has been awarded an NSF Graduate Fellowship to support his research program in machine learning and signal processing. Before joining Rice, Lorenzo received the BSEE degree from Washington State University and interned at DOE's Pacific Northwest National Laboratory (PNNL).
The National Science Foundation Expeditions in Computing Program has awarded $10 million to a Rice University-led team that plans to create wearable and point-of-care microscopes that use on-chip illumination and sensing to non-invasively aid in the diagnosis and monitoring of nearly 100 health conditions that today require a biopsy or blood test.
“The project will produce a platform technology for in vivo, 3-D tissue imaging, with the aim of being able to point a camera at a part of the body and see live biology below the skin without making an incision or drawing blood,” said Rice Professor Ashutosh Sabharwal, the principal investigator on the grant.
The team of 11 co-investigators from Rice, Carnegie Mellon, Harvard, MIT and Cornell is one of three groups to win new five-year grants today from the NSF’s Expeditions in Computing program. Expeditions is an interdisciplinary NSF effort that constitutes the agency’s largest single investment in computer and information science research. Rice co-investigators include Richard Baraniuk, Rebecca Richards-Kortum, and Lin Zhong. Carnegie Mellon co-investigators include Artur Dubrawski, Ioannis Gkioulekas and Rice DSP alum Aswin Sankaranarayanan. Additional co-investigators include Cornell’s Al Molnar, Harvard’s Latanya Sweeney, and MIT’s Ramesh Raskar.
Anyone who has pointed a flashlight at their palm to make their hand glow knows that light can travel through the body. But visible light scatters so much as it passes through soft tissue that it has not been useful for medical imaging. The team aims to unravel this scattered-light puzzle with a technique they call “computational scatterography,” which combines machine learning algorithms with new camera designs.
“Basically, we’re trying to ‘de-scatter’ the light,” said computational imaging expert and Rice DSP faculty member Ashok Veeraraghavan, associate professor of electrical and computer engineering. “In engineering, we call this an inverse problem. Geoscientists use similar inverse techniques on seismic waves to resolve pictures of Earth’s deep interior. Our task, in some ways, is even more complicated because the amount of light scattering that takes place in even a few millimeters of tissue far exceeds that in other problems.”
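The "de-scattering" inverse problem Veeraraghavan describes can be sketched in one dimension. This is only an illustrative analogy, not the team's method: scattering is modeled as convolution with a broad Gaussian kernel, and the inversion uses Wiener deconvolution, with the kernel and damping constant chosen arbitrarily for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hidden 1-D structure: two sharp features, loosely analogous to
# sub-surface detail obscured by tissue.
n = 128
x = np.zeros(n)
x[40], x[80] = 1.0, 0.6

# Toy "scattering": convolution with a broad Gaussian kernel.
k = np.exp(-0.5 * ((np.arange(n) - n // 2) / 6.0) ** 2)
k /= k.sum()
K = np.fft.fft(np.fft.ifftshift(k))
y = np.real(np.fft.ifft(np.fft.fft(x) * K)) + 1e-3 * rng.standard_normal(n)

# "De-scattering" as Wiener deconvolution: invert the blur where the
# kernel has signal, damp it where noise would be amplified.
damping = 1e-4
X_hat = np.fft.fft(y) * np.conj(K) / (np.abs(K) ** 2 + damping)
x_hat = np.real(np.fft.ifft(X_hat))

print(int(np.argmax(x_hat)))  # strongest recovered feature near index 40
```

Real tissue scattering is far harder than this linear shift-invariant model, which is the point of the quote: the forward model itself must be learned or measured before any inversion is possible.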
White blood cell (WBC) count tests are an example of the project’s potential impact. In the U.S., oncologists order millions of WBC tests each week to monitor chemotherapy patients. WBC tests require a finger prick or blood draw and a laboratory, which means they can be performed only at hospitals and clinics. “Imagine a wearable device no larger than a watch that uses sensors to continuously measure white blood cell count and wirelessly communicate with the oncologist’s office,” Sabharwal said. “The patient could go about their daily life. They’d only have to go to the hospital if there was a problem.”
For more information, visit seebelowtheskin.rice.edu