Automated Vision Test
The Wavefront Aberrations (WA) are a collection of different kinds of optical defects, including the familiar defocus and astigmatism corrected by eyeglasses, but also more complex higher-order aberrations such as coma and spherical aberration. The WA provide a comprehensive description of the optics of the eye and thus determine visual acuity. Until recently, however, no practical method of computing this relationship existed. Our solution is to simulate an observer performing the acuity task with an eye possessing a particular set of WA. When a letter is presented, we first distort a digital image of the letter by the specified WA and add noise to mimic the noisiness of the visual system. From previous research, we have determined the noise level that matches human performance. We then attempt to match the blurred, noisy image to similarly blurred candidate letter images and select the closest match. We repeat this for many trials at many letter sizes, and thereby determine the smallest letter that can be reliably identified: the visual acuity. We have streamlined and simplified the key steps of this simulation approach so that the entire process is robust, accurate, simple, and fast. Results are typically obtained in a few seconds.
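The core of one simulated trial — blur the stimulus, add noise, and pick the closest similarly blurred candidate — can be sketched as follows. This is a minimal illustration, not the authors' implementation: a separable Gaussian blur stands in for convolution with the point-spread function derived from the WA, the tiny "T"/"L" letter images and the noise level are invented for the example, and template matching is done by minimum squared difference.

```python
import numpy as np

def blur(img, sigma):
    # Separable Gaussian blur: a crude stand-in for convolution with the
    # point-spread function computed from the wavefront aberrations.
    size = int(6 * sigma) | 1
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, out)

def identify(stimulus, templates, sigma, noise_sd, rng):
    # One trial: blur the presented letter, add noise matched (in the real
    # method) to human performance, then select the closest candidate among
    # similarly blurred templates by summed squared difference.
    noisy = blur(stimulus, sigma) + rng.normal(0.0, noise_sd, stimulus.shape)
    blurred = {name: blur(t, sigma) for name, t in templates.items()}
    return min(blurred, key=lambda n: np.sum((noisy - blurred[n]) ** 2))

# Toy 9x9 letter images (hypothetical stimuli, for illustration only).
T = np.zeros((9, 9)); T[1, 1:8] = 1; T[1:8, 4] = 1
L = np.zeros((9, 9)); L[1:8, 1] = 1; L[7, 1:8] = 1
letters = {"T": T, "L": L}

rng = np.random.default_rng(0)
trials = [identify(T, letters, sigma=1.0, noise_sd=0.05, rng=rng)
          for _ in range(20)]
frac_correct = trials.count("T") / len(trials)
```

Repeating such trials over a range of letter sizes yields a psychometric function, from which the smallest reliably identified size (the acuity) is read off.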
Variable Visibility Glasses for Instrument Flight Training
The technology combines electroactively controlled liquid crystal lenses with a means of determining the pilot's head position. When the pilot's head is positioned to look out the front or side windscreens, the lenses restrict light transmission. When the pilot's head is lowered to view the instrument panel or other cockpit displays, the lenses allow light transmission so that the view of the instruments is unimpeded. Light transmission through the lenses can be selectively controlled by the system over a range from 0.1% to 10%. The lenses are mounted in conventional eyeglass frames, which include a detection system that determines the position and orientation of the pilot's head. Circuits within the frames activate the lenses to restrict light transmission when the pilot's head is oriented to look out the windscreen. A PC, linked to the aircraft flight computer or altimeter, is also in the control loop and turns the system off to allow unimpeded visibility when the aircraft is below 200 feet or under other specified conditions. The technology readiness level of this invention is at stage seven, with a prototype having been tested.
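The control loop described above can be summarized in a few lines. This is a hypothetical sketch of the decision logic only: the function name, the pitch-angle sign convention and threshold, and the use of head pitch as the sole orientation cue are our assumptions, not details from the system itself.

```python
# Transmission limits and the altitude cutoff are taken from the text;
# everything else here is an illustrative assumption.
RESTRICTED_PCT = 0.1    # minimum light transmission (looking outside)
CLEAR_PCT = 10.0        # maximum light transmission (viewing instruments)
MIN_ALTITUDE_FT = 200.0 # below this, the PC disables the restriction

def lens_transmission(head_pitch_deg, altitude_ft, override=False):
    """Return the commanded lens transmission in percent.

    head_pitch_deg: pilot's head pitch; negative values mean the head is
    lowered toward the instrument panel (sign convention assumed).
    override: other specified conditions under which the PC clears the lenses.
    """
    if override or altitude_ft < MIN_ALTITUDE_FT:
        return CLEAR_PCT           # PC in the loop: unimpeded visibility
    if head_pitch_deg < -20.0:     # head lowered to view the instruments
        return CLEAR_PCT
    return RESTRICTED_PCT          # head up, looking out the windscreen
```

In practice the head-tracking circuits in the frames would feed `head_pitch_deg`, and the PC linked to the flight computer or altimeter would supply `altitude_ft` and `override`.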
High-Speed Smart Camera Detects Supersonic Inlet Shocks
For the camera to detect invisible air shocks in an aircraft engine's intake, a fine sheet of laser light is first projected through the airflow. The light is refracted in the densest part of the airflow (the location of the shock), creating a dark spot that shows up as a dip, or negative peak, in the pixel intensity profile of the image. The smart camera uses this information to identify a negative-going edge and a positive-going edge, which are expressed as numeric pixel values within the linear array. Data is output from the circuit as an analog signal, or digitally by an onboard microcontroller using a parallel digital bus or a serial interface such as controller area network (CAN bus), Ethernet, RS-232/485, or USB. Unlike conventional edge-detection systems, which rely on both a high-speed camera and a bulky computer or digital signal processor, this innovation uses an analog technique to process images. Its simple, sleek design consists of three basic parts: a linear image sensor, an analog signal-processing circuit, and a digital circuit. The result is a smaller, more reliable technology with increased processing frame rates. The design can easily be tailored to the end use and reconfigured to respond to positive- and/or negative-going edges. Furthermore, the threshold sensitivity can be varied and set algorithmically, making it well suited to a number of other terrestrial applications, from transportation to manufacturing.
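The edge-finding step — locating the negative-going and positive-going edges that bracket the dark shock region in the line-scan profile — can be illustrated digitally, even though the actual device does this with analog circuitry. This is a schematic sketch with an invented function name and synthetic data, assuming a simple thresholded first difference.

```python
import numpy as np

def find_shock_edges(profile, threshold):
    # Threshold the first difference of the 1-D pixel intensity profile:
    # a negative-going edge marks the entry into the dark shock region,
    # a positive-going edge marks the exit. Both are reported as pixel
    # indices within the linear array.
    d = np.diff(profile.astype(float))
    neg = np.where(d < -threshold)[0]
    pos = np.where(d > threshold)[0]
    neg_edge = int(neg[0]) + 1 if neg.size else None
    pos_edge = int(pos[-1]) + 1 if pos.size else None
    return neg_edge, pos_edge

# Synthetic line-scan profile: a dark dip (the shock) spans pixels 40-59.
profile = np.full(128, 200.0)
profile[40:60] = 50.0
edges = find_shock_edges(profile, threshold=100.0)  # (40, 60)
```

Varying `threshold` mirrors the device's adjustable threshold sensitivity, and dropping either the `neg` or `pos` branch corresponds to reconfiguring it for one edge polarity only.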
Multispectral Imaging, Detection, and Active Reflectance (MiDAR)
The MiDAR transmitter emits coded narrowband structured illumination to generate high-frame-rate multispectral video, perform real-time radiometric calibration, and provide a high-bandwidth simplex optical data-link under a range of ambient irradiance conditions, including darkness. A theoretical framework, based on unique color band signatures, is developed for the multispectral video reconstruction and optical communications algorithms used on MiDAR transmitters and receivers. Experimental tests demonstrate a 7-channel MiDAR prototype consisting of an active array of multispectral high-intensity light-emitting diodes (the MiDAR transmitter) coupled with a state-of-the-art, high-frame-rate NIR computational imager, the NASA FluidCam NIR, which functions as the MiDAR receiver. A 32-channel instrument is currently in development. Preliminary results confirm efficient, radiometrically calibrated, high signal-to-noise ratio (SNR) active multispectral imaging in 7 channels from 405-940 nm at 2048x2048 pixels and 30 Hz. These results demonstrate a cost-effective and adaptive sensing modality with the ability to change color bands and relative intensities in real time, in response to changing science requirements or dynamic scenes. Potential applications of MiDAR include high-resolution nocturnal and diurnal multispectral imaging from air, space, and underwater environments, as well as long-distance optical communication, bidirectional reflectance distribution function characterization, mineral identification, atmospheric correction, UV/fluorescent imaging, 3D reconstruction using Structure from Motion (SfM), and underwater imaging using Fluid Lensing. Multipurpose sensors such as MiDAR, which fuse active sensing and communications capabilities, may be particularly well suited for mass-limited robotic exploration of Earth and the solar system, and represent a possible new generation of instruments for active optical remote sensing.
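One way to picture multispectral reconstruction from coded narrowband illumination is frame-sequential demultiplexing. The sketch below is our simplified assumption, not the published MiDAR algorithm: it supposes the transmitter fires one narrowband LED channel per camera frame plus one all-off frame, so that subtracting the ambient frame from each band frame yields a radiometrically corrected 7-channel cube.

```python
import numpy as np

N_BANDS = 7  # matches the 7-channel prototype in the text

def demultiplex(frames):
    """Reassemble a multispectral cube from frame-sequential captures.

    frames: array of shape (N_BANDS + 1, H, W), where frame 0 was captured
    with all LEDs off (ambient light only) and frame i was captured under
    narrowband channel i. Subtracting the ambient frame from each band
    frame removes the ambient contribution (a simplified stand-in for
    MiDAR's real-time radiometric calibration).
    """
    ambient = frames[0]
    return frames[1:] - ambient

# Synthetic example: ambient background plus a known per-band signal.
rng = np.random.default_rng(1)
ambient = rng.uniform(0.0, 10.0, (1, 16, 16))
bands = ambient + np.arange(1, N_BANDS + 1)[:, None, None] * 5.0
cube = demultiplex(np.concatenate([ambient, bands]))
```

Because the band order and relative LED intensities are set in software on the transmitter, this scheme illustrates how color bands could be changed in real time in response to science requirements or dynamic scenes.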