Portable Integrated Fourier Ptychographic Microscope
Optics
Portable Integrated Fourier Ptychographic Microscope (GSC-TOPS-364)
AI-enabled system offers near real-time, sub-micron imaging
Overview
Fourier ptychographic microscopy (FPM) seeks to address the tradeoff between resolution and field-of-view (FoV) experienced by conventional imaging methods. The technique combines multiple low-resolution images captured under varying illumination angles to reconstruct a single, high-resolution, wide FoV image. Key challenges in FPM devices include lack of portability (preventing use in the field) and long image reconstruction times (preventing real-time feedback, decreasing throughput and efficiency, and leading to poor user experience that presents a barrier to adoption).
NASA’s Goddard Space Flight Center has developed a compact, portable inferencing device, originally designed for biosignature and motility detection in liquid samples, that integrates an FPM with a microfluidic device to perform wide FoV, sub-micron spatial resolution imaging of several different sample types. NASA has significantly improved FPM device technology by creating a small, portable device that uses algorithms to self-calibrate LED positions and trained, sample-type-specific neural networks for recording and image reconstruction. These algorithmic optimizations enable near real-time image reconstruction.
The Technology
NASA’s integrated, portable FPM device combines advanced optical microscopy with AI in a compact form factor. At its core, the system uses a Raspberry Pi camera module equipped with an 8-megapixel sensor and a 3-mm focal-length lens, achieving approximately 1.5× magnification. Illumination is provided by a Unicorn HAT HD LED array positioned 65 mm below the sample stage, creating a synthetic numerical aperture of 0.55. These components are controlled by an NVIDIA Jetson Nano board, which serves as the system's embedded AI computing platform. NASA’s portable FPM device can be integrated with a microfluidic system for sub-micron imaging of liquid samples.
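The synthetic numerical aperture quoted above is the sum of the objective's own NA and the illumination NA set by the outermost LED. The arithmetic can be checked with the stated 65 mm LED-to-sample distance; the lens NA and the outermost LED's lateral offset below are assumed illustrative values (chosen so the numbers work out), not published specifications.

```python
import math

z_mm = 65.0            # LED array to sample distance (from the description)
max_offset_mm = 30.0   # assumed lateral offset of the outermost LED used
na_lens = 0.13         # assumed NA of the 3-mm focal-length lens

# Illumination NA of the most oblique LED, from simple geometry.
na_illum = math.sin(math.atan(max_offset_mm / z_mm))
na_synthetic = na_lens + na_illum    # FPM synthetic aperture
```

Under these assumed values, na_illum is about 0.42 and na_synthetic about 0.55, consistent with the figure in the text; different lens/array choices trade off the same way.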
In addition to its portability, what sets NASA’s integrated FPM device apart is its integration of deep learning capabilities. The invention has two modes – normal mode and deep learning mode. In its normal mode, the system captures data and performs intensity and 3D phase analysis using traditional FPM methods. The deep learning mode enhances this base functionality by employing neural networks for image reconstruction and system optimization. The AI system automatically detects when samples are out of focus and can either mechanically or digitally adjust to the correct focal plane. To achieve near real-time monitoring, deep learning models significantly reduce data acquisition time by selectively using only a portion of the LED array and provide fast reconstruction capabilities leveraging training on specific sample types.
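The out-of-focus detection step can be illustrated with a classical focus metric. NASA's system uses trained neural networks for this; the variance-of-Laplacian score below is a simple, hypothetical stand-in showing how a scalar sharpness measure can discriminate an in-focus frame from a defocused one and so drive a mechanical or digital refocus.

```python
import numpy as np

def sharpness(img):
    """Variance of a discrete Laplacian: higher means sharper focus."""
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(1)
in_focus = rng.random((128, 128))        # stand-in for a sharp frame

# Crude defocus model: 5x5 box blur built from shifted copies (wrap-around).
defocused = sum(np.roll(in_focus, (i, j), axis=(0, 1))
                for i in range(-2, 3) for j in range(-2, 3)) / 25.0
```

Scanning this score across candidate focal planes and moving toward its maximum is the standard classical autofocus loop that a learned model can replace or accelerate.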
While originally designed for imaging biosignature motility in liquid samples for spaceflight applications, this NASA innovation can image many different types of samples and is not limited to biological specimens. The ability to operate the system in the field further broadens its use cases.


Benefits
- Near real-time imaging: By leveraging AI/ML technology, NASA’s portable FPM can generate reconstructed imagery in near real-time, much faster than conventional, more computationally expensive algorithms. This may open up new applications such as the high-resolution imaging of cellular processes.
- Spatial resolution: The device achieves sub-micron resolution, critical for detecting biosignatures and other objects of interest.
- Portable and compact: NASA’s FPM is a small form factor (~30 x 15 x 15 cm) device with low mass and power consumption. Additionally, integrated hardware runs the deep learning models locally, eliminating the need to transfer data outside the system for processing (enabling "on-edge" use). These factors make the system ideal for use in the field.
- Integrated device: NASA’s FPM is designed to enable integration into microfluidic devices for easy imaging of liquid samples.
Applications
- Quantitative phase imaging
- Biological imaging (cellular/subcellular imaging, tissue analysis, high-throughput flow cytometry)
- Digital pathology
- Drug discovery & development (study of drug delivery mechanisms at cellular level and compound microstructures)
- Aberration metrology
- Environmental monitoring
- Materials science & engineering
Technology Details
Optics
GSC-TOPS-364
GSC-18799-1
Nguyen, Thanh, et al., “Portable flow device using Fourier ptychography microscopy and deep learning for detection of biosignatures,” April 24, 2020, https://ntrs.nasa.gov/citations/20205001364
Similar Results

Digital Projection Focusing Schlieren System
NASA's digital projection focusing schlieren system is attached to a commercial-off-the-shelf camera. For focusing schlieren measurements, it directs light from the light source through a condenser lens and linear polarizer toward a beam-splitter, where the linear, vertically polarized component of the light is reflected onto the optical axis of the instrument. The light passes through the patterned LCD element, a polarizing prism, and a quarter-wave plate prior to projection from the assembly as left- or right-circularly polarized light. The grid-patterned light (having passed through the LCD element) is directed past the density object onto a retroreflective background (RBG) that serves as the source grid. Upon reflection off the RBG, the polarization state of the light is mirrored. It passes the density object a second time and is then reimaged by the system. Upon encountering the polarizing prism the second time, the light is slightly offset. This refracted light passes through the LCD element, now serving as the cutoff grid, a second time before being imaged by the camera.
The LCD element can be programmed to display a variety of grid patterns to enable sensitivity to different density gradients. The color properties of the LCD can be leveraged in combination with multiple colored light sources to enable simultaneous multi-color, multi-technique data collection.

Portable Microscope
The handheld digital microscope features a 3D-printed chassis to house its hardware, firmware, and rechargeable Li-ion battery with built-in power management. It incorporates an internal stainless-steel cage system to enclose and provide mechanical rigidity for the optics and imaging sensor. To reduce the microscope’s size, yet retain high spatial resolution, engineers devised an optical light path that uniquely folds back on itself using high reflectivity mirrors, thus significantly reducing internal volume.
Imaging control and acquisition is performed using a secure web-based graphical user interface accessible via any wireless enabled device. The microscope serves as its own wireless access point thus obviating the need for a pre-existing network. This web interface enables multiple simultaneous connections and facilitates data sharing with clinicians, scientists, or other personnel as needed. Acquired images can be stored locally on the microscope server or on a removable SD card. Data can be securely downloaded to other devices using a range of industry standard protocols.
Although the handheld digital microscope was originally developed for in-flight medical diagnosis in microgravity applications, prototypes were thoroughly ground-tested in a variety of environments to verify that microbial samples could be accurately resolved for identification and compositional analysis in terrestrial field use. Owing to its portability, other applications demanding rapid results may include research, education, veterinary medicine, military use, contagion disaster response, telemedicine, and point-of-care medicine.

Biomarker Sensor Arrays for Microfluidics Applications
This invention provides a method and system for fabricating a biomarker sensor array by dispensing one or more entities using a precisely positioned, electrically biased nanoprobe immersed in a buffered fluid over a transparent substrate. Fine patterning of the substrate can be achieved by positioning and selectively biasing the probe in a particular region, changing the pH in a sharp, localized volume of fluid less than 100 nm in diameter and resulting in selective processing of that region. One example of the implementation of this technique is related to Dip-Pen Nanolithography (DPN), where an Atomic Force Microscope probe can be used as a pen to write protein and DNA aptamer inks on a transparent substrate functionalized with silane-based self-assembled monolayers. However, the invention has a much broader range of applicability. For example, it can be applied to the formation of patterns using biological materials, chemical materials, metals, polymers, semiconductors, small molecules, organic and inorganic thin films, or any combination of these.

Fluid Lensing System for Imaging Underwater Environments
Fluid lensing exploits optofluidic lensing effects in two-fluid interfaces. When used in such interfaces, like air and water, coupled with computational imaging and a unique computer vision-processing pipeline, it not only removes strong optical distortions along the line of sight, but also significantly enhances the angular resolution and signal-to-noise ratio of an otherwise underpowered optical system. As high-frame-rate multi-spectral data are captured, fluid lensing software processes the data onboard and outputs a distortion-free 3D image of the benthic surface. This includes accounting for the way an object can look magnified or appear smaller than usual, depending on the shape of the wave passing over it, and for the increased brightness caused by caustics. By running complex calculations, the algorithm at the heart of fluid lensing technology is largely able to correct for these troublesome effects.

The process introduces a fluid distortion characterization methodology, caustic bathymetry concepts, the Fluid Lensing Lenslet Homography technique, and a 3D Airborne Fluid Lensing Algorithm as novel approaches for characterizing the aquatic surface wave field, modeling bathymetry using caustic phenomena, and robust high-resolution aquatic remote sensing. The formation of caustics by refractive lenslets is an important concept in the fluid lensing algorithm. The use of fluid lensing technology on drones is a novel means for 3D imaging of aquatic ecosystems from above the water's surface at the centimeter scale. Fluid lensing data are captured from low-altitude, cost-effective electric drones to achieve multi-spectral imagery and bathymetry models at the centimeter scale over regional areas. In addition, this breakthrough technology is developed for future in-space validation for remote sensing of shallow marine ecosystems from Low Earth Orbit (LEO).
NASA's FluidCam instrument, developed for airborne and spaceborne remote sensing of aquatic systems, is a high-performance multi-spectral computational camera using fluid lensing. The space-capable FluidCam instrument is robust and sturdy enough to collect data while mounted on an aircraft (including drones) over water.

Multi-Spectral Imaging Pyrometer
This NASA technology transforms a conventional infrared (IR) imaging system into a multi-wavelength imaging pyrometer using a tunable optical filter. The actively tunable optical filter is based on an exotic phase-change material (PCM) that exhibits a large, reversible refractive index shift in response to an applied energetic stimulus. This change is non-volatile, and no additional energy is required to maintain its state once set. The filter is placed between the scene and the imaging sensor and switched between user-selected center wavelengths to create a series of single-wavelength, monochromatic, two-dimensional images. At the pixel level, the intensity values of these monochromatic images represent the wavelength-dependent blackbody energy emitted by the object due to its temperature. Ratioing the measured spectral irradiance at each wavelength yields emissivity-independent temperature data at each pixel. The filter’s center wavelength (CWL) and full width at half maximum (FWHM), which are related to the quality factor (Q) of the filter, are actively tunable on the order of nanoseconds-microseconds (GHz-MHz). This behavior is electronically controlled and can be operated time-sequentially (on a nanosecond time scale) in the control electronics, a capability not possible with conventional optical filtering technologies.
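The emissivity cancellation behind the ratioing step can be shown with standard two-color pyrometry under the Wien approximation, a sketch of the general principle rather than this instrument's processing chain; the wavelengths, temperature, and emissivity below are arbitrary illustrative values.

```python
import math

C2 = 1.4388e-2   # second radiation constant, m*K

def wien_radiance(lam_m, temp_k, emissivity):
    """Wien approximation to spectral radiance (arbitrary scale)."""
    return emissivity * lam_m**-5 * math.exp(-C2 / (lam_m * temp_k))

lam1, lam2 = 0.8e-6, 0.9e-6     # two filter center wavelengths (illustrative)
T_true, eps = 1500.0, 0.7       # gray-body emissivity, same at both wavelengths

# The emissivity cancels in the two-wavelength ratio...
ratio = wien_radiance(lam1, T_true, eps) / wien_radiance(lam2, T_true, eps)

# ...so temperature follows from the ratio and the wavelengths alone.
T_est = (C2 * (1 / lam2 - 1 / lam1)
         / (math.log(ratio) - 5 * math.log(lam2 / lam1)))
```

Because eps divides out of the ratio, T_est recovers T_true regardless of the (gray-body) emissivity chosen, which is why ratioed monochromatic images give emissivity-independent temperature maps.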