Optics
Nested Focusing Optics for Compact Neutron Sources
Conventional neutron beam experiments demand high fluxes that can only be obtained at research facilities equipped with a reactor source and neutron optics. However, access to these facilities is limited. The NASA technology uses grazing incidence reflective optics to produce focused beams of neutrons (Figure 1) from compact, commercially available sources, resulting in higher flux concentrations. Neutrons are doubly reflected off a parabolic and a hyperbolic mirror at a sufficiently small angle, creating neutron beams that are convergent, divergent, or parallel. Neutron flux can be increased by concentrically nesting mirrors with the same focal length and curvature, so that multiple neutron beams converge at a single focal point. The improved flux from the compact source may be used for non-destructive testing, imaging, and materials analysis.

The grazing incidence neutron optic mirrors are fabricated using an electroformed nickel replication technique developed by NASA and the Harvard-Smithsonian Center for Astrophysics (Figure 2). A machined aluminum mandrel is super-polished to a surface roughness of 3-4 angstroms root mean square and plated with layers of a highly reflective nickel-cobalt alloy. Residual stresses that can cause mirror warping are eliminated by periodically reversing the anode and cathode polarity of the electroplating system, resulting in a deformation-free surface. The fabrication process has been used to produce 0.5-meter and 1.0-meter lenses.
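The grazing-incidence geometry works because neutrons totally reflect only below a small critical angle set by the mirror material. As an illustrative sketch (not NASA-specific design math), the critical angle can be estimated from the coating's scattering-length density; the nickel value below is a textbook figure, assumed for illustration:

```python
import math

def critical_angle_deg(wavelength_A, sld_A2):
    """Critical grazing angle for total external neutron reflection,
    small-angle form: theta_c = wavelength * sqrt(SLD / pi)."""
    return math.degrees(wavelength_A * math.sqrt(sld_A2 / math.pi))

NI_SLD = 9.4e-6  # nickel scattering-length density, 1/Angstrom^2 (textbook value)
theta = critical_angle_deg(1.8, NI_SLD)  # 1.8-Angstrom thermal neutrons
print(f"{theta:.2f} deg")  # about 0.18 degrees -> grazing incidence required
```

The sub-degree result shows why the mirrors must be nearly parallel to the beam, and why nesting many concentric shells is the practical way to intercept more of the source flux.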
Sensors
Multi-Spectral Imaging Pyrometer
This NASA technology transforms a conventional infrared (IR) imaging system into a multi-wavelength imaging pyrometer using a tunable optical filter. The actively tunable optical filter is based on an exotic phase-change material (PCM) that exhibits a large, reversible refractive index shift under an applied energetic stimulus. This change is non-volatile, and no additional energy is required to maintain its state once set. The filter is placed between the scene and the imaging sensor and switched between user-selected center wavelengths to create a series of single-wavelength, monochromatic, two-dimensional images. At the pixel level, the intensity values of these monochromatic images represent the wavelength-dependent blackbody energy emitted by the object due to its temperature. Ratioing the measured spectral irradiance at each wavelength yields emissivity-independent temperature data at each pixel. The filter's center wavelength (CWL) and full width at half maximum (FWHM), which are related to the quality factor (Q) of the filter, are actively tunable on the order of nanoseconds to microseconds (GHz to MHz rates). This behavior is electronically controlled and can be operated time-sequentially (on a nanosecond time scale) by the control electronics, a capability not possible with conventional optical filtering technologies.
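The emissivity-independent ratio step can be illustrated with the Wien approximation to Planck's law: for a gray body, the emissivity cancels out of the two-wavelength intensity ratio, leaving temperature as the only unknown. A minimal sketch (the wavelengths, temperature, and emissivity below are made-up illustration values, not instrument parameters):

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(lam, T, emissivity=1.0):
    """Wien approximation to Planck's law (spectral radiance, arbitrary units)."""
    return emissivity * lam**-5 * math.exp(-C2 / (lam * T))

def ratio_temperature(I1, I2, lam1, lam2):
    """Temperature from the two-wavelength intensity ratio; a common
    (gray-body) emissivity cancels out of I1/I2."""
    num = C2 * (1.0 / lam1 - 1.0 / lam2)
    den = 5.0 * math.log(lam2 / lam1) - math.log(I1 / I2)
    return num / den

lam1, lam2 = 4.0e-6, 4.5e-6   # two hypothetical filter center wavelengths, meters
T_true = 1200.0               # kelvin
eps = 0.37                    # "unknown" gray-body emissivity
I1 = wien_intensity(lam1, T_true, eps)
I2 = wien_intensity(lam2, T_true, eps)
print(ratio_temperature(I1, I2, lam1, lam2))  # recovers ~1200 K regardless of eps
```

Changing `eps` leaves the recovered temperature unchanged, which is the core advantage of ratio pyrometry over single-band radiometric temperature measurement.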
Optics
Ruggedized Infrared Camera
This new technology applies NASA engineering to a FLIR Systems Boson® Model No. 640 to enable a robust IR camera for use in space and other extreme applications. Enhancements to the standard Boson® platform include a ruggedized housing, connector, and interface. The Boson® is a small, uncooled, commercial-off-the-shelf (COTS) IR camera based on microbolometer technology that operates in the long-wave infrared (LWIR) portion of the IR spectrum. It is available with several lens configurations. NASA's modifications allow the IR camera to survive launch conditions and improve heat removal for space-based (vacuum) operation. The design includes a custom housing to secure the camera core, along with a lens clamp to maintain a tight lens-core connection during high-vibration launch conditions. The housing also provides additional conductive cooling for the camera components, allowing operation in a vacuum environment. A custom printed circuit board (PCB) in the housing allows for a USB connection using a military-standard (MIL-STD) miniaturized locking connector instead of the standard USB Type-C connector. The system maintains the standard USB protocol for easy compatibility and "plug-and-play" operation.
Instrumentation
Projected Background-Oriented Schlieren Imaging
The Projected BOS imaging system developed at the NASA Langley Research Center provides a significant advancement over other BOS flow visualization techniques. Specifically, this BOS imaging method removes the need for a physically patterned retroreflective background within the flow of interest and is therefore insensitive to changing conditions caused by the flow. For example, in a wind tunnel used for aerodynamics testing, vibrations and temperature changes can affect the entire tunnel and anything inside it. Any patterned background within the wind tunnel is subject to these changing conditions, and those effects must be accounted for in the post-processing of the BOS image. This post-processing is not necessary in the Projected BOS process. In the Projected BOS system, a pattern is projected onto a retroreflective background across the flow of interest (Figure 1). The projected pattern can be made physically (a pattern on a transparent slide) or produced digitally on an LCD screen. In this projection scheme, a reference image can be taken at the same time as the signal image, facilitating real-time BOS imaging and allowing the pattern to be changed or optimized during the measurements. Thus far, the Projected BOS imaging technology has been proven to work by visualizing the air flow from a compressed air canister with this new system (Figure 2).
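BOS post-processing generally recovers the apparent displacement of the background pattern between the reference and signal images; density gradients in the flow deflect light and shift the pattern locally. A minimal sketch of that core step using FFT cross-correlation on a synthetic pattern (an illustration of the general BOS principle, not NASA's processing pipeline):

```python
import numpy as np

def bos_shift(ref, sig):
    """Estimate the integer-pixel displacement of the background pattern
    between reference and signal images via FFT cross-correlation."""
    xcorr = np.fft.ifft2(np.fft.fft2(sig) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # wrap peak indices to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, xcorr.shape))

rng = np.random.default_rng(0)
ref = rng.random((128, 128))               # synthetic projected pattern
sig = np.roll(ref, (3, -2), axis=(0, 1))   # pattern deflected by the flow
print(bos_shift(ref, sig))                 # -> (3, -2)
```

Real BOS processing applies this kind of correlation in small interrogation windows to build a dense displacement field, which is then related to the refractive-index (density) gradients in the flow.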
Optics
Image from internal NASA presentation developed by inventor and dated May 4, 2020.
Reflection-Reducing Imaging System for Machine Vision Applications
NASA's imaging system comprises a small CMOS camera fitted with a C-mount lens affixed to a 3D-printed mount. Light from the high-intensity LED is passed through a lens that both diffuses and collimates the LED output, and this light is coupled onto the camera's optical axis using a 50:50 beam-splitting prism. Use of the collimating/diffusing lens to condition the LED output provides an illumination source of similar diameter to the camera's imaging lens. This is the feature that reduces or eliminates shadows that would otherwise be projected onto the subject plane as a result of refractive index variations in the imaged volume. By coupling the light from the LED unit onto the camera's optical axis, reflections from windows, which are often present in wind tunnel facilities to allow direct views of a test section, can be minimized or eliminated when the camera is placed at a small angle of incidence relative to the window's surface. This effect is demonstrated in the image on the bottom left of the page. Eight imaging systems were fabricated and used for capturing background-oriented schlieren (BOS) measurements of flow from a heat gun in the 11-by-11-foot test section of the NASA Ames Unitary Plan Wind Tunnel (see test setup on right). Two additional camera systems (not pictured) captured photogrammetry measurements.
Instrumentation
Assembly for Simplified Hi-Res Flow Visualization
NASA's single-grid, self-aligned focusing schlieren optical assembly is attached to a commercial-off-the-shelf camera. It directs light from the light source through a condenser lens and linear polarizer toward a polarizing beam-splitter, where the linear, vertically polarized component of light is reflected onto the optical axis of the instrument. The light passes through a Ronchi ruling grid, a polarizing prism, and a quarter-wave plate prior to projection from the assembly as right-circularly polarized light. The grid-patterned light (having passed through the Ronchi grid) is directed past the density object onto a retroreflective background that serves as the source grid. Upon reflection off the retroreflective background, the polarization state of the light is mirrored. It passes the density object a second time and is then reimaged by the system. Upon encountering the polarizing prism the second time, the light is refracted, resulting in a slight offset. This refracted light passes through the Ronchi ruling grid, now serving as the cutoff grid, a second time before being imaged by the camera. Both small- and large-scale experimental setups have been evaluated and shown to be capable of fields of view of 10 and 300 millimeters, respectively. Observed depths of field were found to be comparable to those of existing systems. Light sources, polarizing prisms, retroreflective materials, and lenses can be customized to suit a particular experiment. For example, with a high-speed camera and a laser light source, the system has collected flow images at a rate of 1 MHz.
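The polarization bookkeeping in the double pass can be sketched with Jones calculus: vertical light becomes circular after the quarter-wave plate, the retroreflection mirrors the state, and the return pass converts it to the orthogonal linear state so it exits through the polarizing prism rather than retracing its path. In this simplified fixed-frame convention the retroreflection is modeled as the identity (a common double-pass idealization, not the exact behavior of a real retroreflective material):

```python
import numpy as np

V = np.array([0, 1], dtype=complex)        # vertically polarized input from the PBS
QWP45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                        [1 - 1j, 1 + 1j]]) # quarter-wave plate, fast axis at 45 deg

circ = QWP45 @ V          # after one pass: circular polarization
out = QWP45 @ circ        # after retroreflection and the second pass
print(np.abs(circ))       # equal components -> circular state
print(np.abs(out))        # [1, 0]: horizontal, so it now transmits at the prism
```

Two passes through a quarter-wave plate act as a half-wave plate, which is why the returning light is rotated 90 degrees and takes the offset exit path through the prism.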
Optics
Image from NTRS publication
Fingerprinting for Rapid Battery Inspection
The technology utilizes photopolymer droplets (invisible to the digital radiograph) with embedded radiopaque fragments to create randomized fingerprints on battery samples. The droplets are deposited using a jig (see figure on right) that precisely positions samples. Then, at different points during battery R&D testing or use, digital radiography imaging with micron-level resolution can be performed. The high-resolution imaging required to detect dendrite formation requires images to be collected in multiple “tiles,” as shown below. The randomized fingerprints uniquely identify the relative positioning of these tiles, allowing rapid assembly of composite high-resolution images from multiple tiles. The same composite-creation process can be used for images taken at a series of points in time during testing, and background subtraction can be applied to efficiently compare how the battery changes over successive charge/discharge cycles and identify dendrite formation. This inspection technique has proven effective for thin-film pouch cell prototypes at NASA, and it works well at the lowest available X-ray energy level (limiting impact on the samples). The Fingerprinting for Rapid Battery Inspection technology is available for patent licensing.
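The tile-registration idea can be sketched as a voting problem: because the fingerprint dots are randomized, the pairwise coordinate differences between dots seen in two overlapping tiles all agree on one translation, while spurious pairs scatter. The dot coordinates and array sizes below are hypothetical illustration values, not the NASA implementation:

```python
import numpy as np

def tile_offset(dots_a, dots_b):
    """Relative offset between two radiograph tiles from shared fingerprint
    dots: every truly matching dot pair votes for the same (dy, dx)."""
    diffs = (dots_a[:, None, :] - dots_b[None, :, :]).reshape(-1, 2)
    vals, counts = np.unique(diffs, axis=0, return_counts=True)
    return vals[np.argmax(counts)]

# hypothetical fingerprint-dot centroids (pixels) seen in two overlapping tiles
dots_a = np.array([[10, 12], [40, 7], [25, 30]])
dots_b = dots_a - np.array([8, 5])          # same dots in tile-b coordinates
print(tile_offset(dots_a, dots_b))          # -> [8 5]

# once tiles/time points are registered, change detection is a subtraction
baseline = np.zeros((4, 4))
later = baseline.copy()
later[2, 1] = 1.0                           # a new high-contrast feature
print(np.argwhere(later - baseline > 0.5))  # -> [[2 1]]
```

The same offset-finding step registers images taken at different times, after which the background subtraction isolates features such as growing dendrites.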
Optics
A Porites coral imaged at the air-water interface that causes fluid lensing. From 2013 American Samoa field campaign, Dr. Ved Chirayath
Fluid Lensing System for Imaging Underwater Environments
Fluid lensing exploits optofluidic lensing effects at two-fluid interfaces, such as that between air and water. Coupled with computational imaging and a unique computer vision processing pipeline, it not only removes strong optical distortions along the line of sight, but also significantly enhances the angular resolution and signal-to-noise ratio of an otherwise underpowered optical system. As high-frame-rate multi-spectral data are captured, fluid lensing software processes the data onboard and outputs a distortion-free 3D image of the benthic surface. This includes accounting for the way an object can look magnified or appear smaller than usual, depending on the shape of the wave passing over it, and for the increased brightness caused by caustics. By running complex calculations, the algorithm at the heart of fluid lensing technology is largely able to correct for these troublesome effects. The process introduces a fluid distortion characterization methodology, caustic bathymetry concepts, the Fluid Lensing Lenslet Homography technique, and a 3D Airborne Fluid Lensing Algorithm as novel approaches for characterizing the aquatic surface wave field, modeling bathymetry using caustic phenomena, and robust high-resolution aquatic remote sensing. The formation of caustics by refractive lenslets is an important concept in the fluid lensing algorithm. The use of fluid lensing technology on drones is a novel means for 3D imaging of aquatic ecosystems from above the water's surface at the centimeter scale. Fluid lensing data are captured from low-altitude, cost-effective electric drones to achieve multi-spectral imagery and bathymetry models at the centimeter scale over regional areas. In addition, this breakthrough technology is being developed for future in-space validation for remote sensing of shallow marine ecosystems from Low Earth Orbit (LEO).
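The distortion that fluid lensing must undo originates in Snell's law at the wavy air-water interface: a tilted surface normal steers the refracted ray, shifting and magnifying the apparent position of the benthic surface below. An illustrative vector-form sketch of that refraction (the fluid lensing algorithm itself is far more involved; the wave slope here is a made-up value):

```python
import numpy as np

N_AIR, N_WATER = 1.0, 1.33

def refract(incident, normal, n1=N_AIR, n2=N_WATER):
    """Vector form of Snell's law: direction of a unit ray after crossing an
    interface whose unit normal points back into the incident medium."""
    r = n1 / n2
    cos_i = -float(np.dot(normal, incident))
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection (only possible when n1 > n2)
    return r * incident + (r * cos_i - np.sqrt(1.0 - sin2_t)) * normal

down = np.array([0.0, -1.0])                        # nadir-viewing ray
flat = refract(down, np.array([0.0, 1.0]))          # calm surface
tilt = 0.1                                          # wave slope, radians (illustrative)
wavy = refract(down, np.array([np.sin(tilt), np.cos(tilt)]))
print(flat)  # [0, -1]: undeviated at normal incidence
print(wavy)  # deflected ray: the wave slope shifts the underwater footprint
```

Tracking how a whole field of such wave-tilted "lenslets" bends rays over time is what lets the algorithm characterize the surface wave field and invert the distortion frame by frame.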
NASA's FluidCam instrument, developed for airborne and spaceborne remote sensing of aquatic systems, is a high-performance multi-spectral computational camera using Fluid lensing. The space-capable FluidCam instrument is robust and sturdy enough to collect data while mounted on an aircraft (including drones) over water.