optics
Automated Vision Test
Wavefront aberrations (WA) are a collection of optical defects, including the familiar defocus and astigmatism corrected by eyeglasses, as well as more complex higher-order aberrations such as coma and spherical aberration. The WA provide a comprehensive description of the optics of the eye, and thus determine visual acuity. But until recently, no practical method of computing this relationship existed. Our solution is to simulate an observer performing the acuity task with an eye possessing a particular set of WA. When a letter is presented, we first distort a digital image of the letter according to the specified WA, then add noise to mimic the noisiness of the visual system. From previous research, we have determined the noise level that matches human performance. We then attempt to match the blurred, noisy image to similarly blurred candidate letter images, and select the closest match. Repeating this for many trials at many letter sizes determines the smallest letter that can be reliably identified: the visual acuity. We have streamlined and simplified the key steps of this simulation so that the entire process is robust, accurate, simple, and fast; results are typically obtained in a few seconds.
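The simulation loop described above can be sketched as follows. This is a minimal, hypothetical illustration only: the "letters" are tiny binary glyphs, the aberration-derived blur is replaced by a simple box-blur stand-in, and the noise level is an arbitrary choice, not the calibrated value mentioned in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny binary glyphs standing in for rendered optotype images (hypothetical).
LETTERS = {
    "E": np.array([[1, 1, 1], [1, 0, 0], [1, 1, 1], [1, 0, 0], [1, 1, 1]], float),
    "F": np.array([[1, 1, 1], [1, 0, 0], [1, 1, 1], [1, 0, 0], [1, 0, 0]], float),
    "L": np.array([[1, 0, 0], [1, 0, 0], [1, 0, 0], [1, 0, 0], [1, 1, 1]], float),
}

def blur(img):
    """Cross-shaped box blur, a stand-in for the aberration-derived point spread."""
    p = np.pad(img, 1)
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0

def trial(true_letter, noise_sd):
    """One simulated trial: blur the letter, add noise, match to blurred candidates."""
    stimulus = blur(LETTERS[true_letter])
    stimulus = stimulus + rng.normal(0.0, noise_sd, stimulus.shape)
    # Template matching: pick the candidate with the smallest squared error.
    best = min(LETTERS, key=lambda k: np.sum((stimulus - blur(LETTERS[k])) ** 2))
    return best == true_letter

# Many trials at one "letter size"; sweeping sizes would locate the acuity limit.
n_trials = 200
hits = sum(trial("E", noise_sd=0.3) for _ in range(n_trials))
print(f"identification rate: {hits / n_trials:.2f}")
```

Sweeping the letter size and fitting the resulting psychometric function would then yield the simulated acuity threshold.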
instrumentation
Flight Controls
Variable Visibility Glasses for Instrument Flight Training
The technology combines electroactively controlled liquid crystal lenses with a means for determining the pilot's head position. When the pilot's head is positioned to look out the front or side windscreens, the lenses restrict light transmission. When the pilot's head is lowered to view the instrument panel or other cockpit displays, the lenses allow light transmission so that the view of the instruments is unimpeded. Light transmission through the lenses can be selectively controlled by the system, ranging from 0.1% to 10%. The lenses are mounted in conventional eyeglass frames. The frames include a detection system to determine the position and orientation of the pilot's head, and circuits within the frames activate the lenses to restrict light transmission when the pilot's head is oriented to look out the windscreen. A PC, linked to the aircraft flight computer or altimeter, is also in the control loop and turns off the system to allow unimpeded visibility when the aircraft is below 200 feet or under other specified conditions. The technology readiness level of this invention is at stage seven, with a prototype having been tested.
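The control logic above can be sketched as a simple decision function. The head-pitch threshold and default restriction level below are hypothetical illustrative values; only the 200-foot cutoff and the 0.1%-10% selectable range come from the description.

```python
MIN_ALTITUDE_FT = 200                      # system disabled below this altitude
RESTRICT_MIN, RESTRICT_MAX = 0.001, 0.10   # selectable restricted range (0.1%-10%)
HEAD_DOWN_PITCH_DEG = -20.0                # hypothetical "looking at the panel" angle

def lens_transmission(head_pitch_deg: float, altitude_ft: float,
                      restricted_level: float = 0.001) -> float:
    """Return the commanded lens transmission fraction (1.0 = unimpeded)."""
    if not (RESTRICT_MIN <= restricted_level <= RESTRICT_MAX):
        raise ValueError("restricted level must lie in the 0.1%-10% range")
    if altitude_ft < MIN_ALTITUDE_FT:
        return 1.0            # PC in the control loop shuts the system off near the ground
    if head_pitch_deg <= HEAD_DOWN_PITCH_DEG:
        return 1.0            # head lowered toward the instrument panel: unimpeded view
    return restricted_level   # looking out the front or side windscreens

print(lens_transmission(0.0, 3000.0))    # 0.001 (restricted: simulated IMC)
print(lens_transmission(-30.0, 3000.0))  # 1.0 (viewing the instruments)
print(lens_transmission(0.0, 150.0))     # 1.0 (below the altitude cutoff)
```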
optics
MIDAR
Multispectral Imaging, Detection, and Active Reflectance (MiDAR)
The MiDAR transmitter emits coded narrowband structured illumination to generate high-frame-rate multispectral video, perform real-time radiometric calibration, and provide a high-bandwidth simplex optical data-link under a range of ambient irradiance conditions, including darkness. A theoretical framework, based on unique color band signatures, is developed for the multispectral video reconstruction and optical communications algorithms used on MiDAR transmitters and receivers. Experimental tests demonstrate a 7-channel MiDAR prototype consisting of an active array of multispectral high-intensity light-emitting diodes (MiDAR transmitter) coupled with a state-of-the-art, high-frame-rate NIR computational imager, the NASA FluidCam NIR, which functions as a MiDAR receiver. A 32-channel instrument is currently in development. Preliminary results confirm efficient, radiometrically-calibrated, high signal-to-noise ratio (SNR) active multispectral imaging in 7 channels from 405-940 nm at 2048x2048 pixels and 30 Hz. These results demonstrate a cost-effective and adaptive sensing modality, with the ability to change color bands and relative intensities in real-time, in response to changing science requirements or dynamic scenes. Potential applications of MiDAR include high-resolution nocturnal and diurnal multispectral imaging from air, space and underwater environments as well as long-distance optical communication, bidirectional reflectance distribution function characterization, mineral identification, atmospheric correction, UV/fluorescent imaging, 3D reconstruction using Structure from Motion (SfM), and underwater imaging using Fluid Lensing. Multipurpose sensors, such as MiDAR, which fuse active sensing and communications capabilities, may be particularly well-suited for mass-limited robotic exploration of Earth and the solar system and represent a possible new generation of instruments for active optical remote sensing.
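The core idea of reconstruction from unique color band signatures can be illustrated with a toy sketch. Everything here is hypothetical (an arbitrary invertible code matrix, a random noiseless scene, no calibration or optics model); it is not the actual MiDAR algorithm, only the linear demultiplexing principle it builds on.

```python
import numpy as np

rng = np.random.default_rng(1)

n_bands = 7
# Hypothetical code matrix: frame f mixes the bands with weights C[f, b].
# Each band's column is its unique temporal signature; this particular
# matrix (identity plus ones) is invertible, so the bands can be separated.
C = np.eye(n_bands) + np.ones((n_bands, n_bands))

# Simulated per-band scene radiance for a 4x4-pixel patch.
scene = rng.random((n_bands, 4, 4))

# Receiver frames: each frame integrates the bands lit during that frame.
frames = np.einsum("fb,bxy->fxy", C, scene)

# Reconstruction: invert the known code matrix at every pixel.
recovered = np.linalg.solve(C, frames.reshape(n_bands, -1)).reshape(scene.shape)
print(np.allclose(recovered, scene))  # True: exact up to floating-point error
```

Because the transmitter knows the code matrix exactly, the same structure also supports the radiometric-calibration and data-link roles described above.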