Electric field distribution along JPL's lens-coupled dielectric waveguide.
Lens-Coupled Dielectric Waveguides
Conventional interconnects taper the extremities of the dielectric waveguide and insert them directly into the metallic waveguides, relying on a long waveguide transition to reduce coupling loss (radiation at the dielectric-metallic interface). With JPL's novel interconnect solution, a lens couples the power from the metallic waveguide into the dielectric waveguide. (This lens can be fabricated inexpensively from the same dielectric material as the waveguide.) The ellipsoidal geometry of the lens is designed to maximize the power coupled into the dielectric waveguide, so that only a small fraction of the coupled power (14 to 20 dB) radiates at the interface. A small stepped-impedance section at the input of the lens, inserted in the waveguide, provides an improved impedance-matching network at the discontinuity. Unlike conventional interconnects, the lens-coupled dielectric waveguide does not depend on physical contact; this improves reliability, reduces packaging complexity, and adds vibration/stress immunity.
New Variables for Iterative Transform Phase Retrieval
The Discrete Fourier Transform (DFT) is the primary tool of digital signal processing. By defining new variables for iterative transform phase retrieval using a diagonal form of the DFT matrix, the phase retrieval problem appears to decouple in special cases, which can increase the computational efficiency of phase retrieval. The current state of the art for increasing DFT computational efficiency is the Fast Fourier Transform (FFT), but prior art gives no consideration to the special-purpose computational architectures common to some digital signal processing applications. The new DFT method exploits a special computational approach based on analyzing the DFT as a transformation in a complex vector space. Traditional iterative transform techniques can be slow to converge; in this new basis set, the algorithm can decouple, allowing a closed-form expression in special cases.
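The "diagonal form" idea can be illustrated numerically. The unitary DFT matrix satisfies F⁴ = I, so its eigenvalues are fourth roots of unity, and in its eigenbasis the transform acts as a simple diagonal scaling. This sketch (not the invention's actual algorithm, just the underlying linear-algebra fact) shows the construction:

```python
import numpy as np

N = 8
n = np.arange(N)
# Unitary DFT matrix: F[j, k] = exp(-2*pi*i*j*k/N) / sqrt(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Diagonal form: F = V diag(w) V^-1
w, V = np.linalg.eig(F)

# The unitary DFT satisfies F^4 = I, so all eigenvalues have unit modulus
# (they are fourth roots of unity: +1, -1, +i, -i).
assert np.allclose(np.linalg.matrix_power(F, 4), np.eye(N))
assert np.allclose(np.abs(w), np.ones(N))

# In the eigenbasis, applying the DFT is just an elementwise scaling.
x = np.random.default_rng(0).standard_normal(N)
c = np.linalg.solve(V, x)   # coordinates of x in the eigenbasis
y_diag = V @ (w * c)        # diagonal application of the DFT
assert np.allclose(y_diag, F @ x)
```

In such a basis, operations that couple all samples in the standard basis become independent per-coordinate operations, which is the kind of decoupling the abstract refers to.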
Automated Vision Test
Wavefront aberrations (WA) are a collection of different sorts of optical defects, including the familiar defocus and astigmatism corrected by eyeglasses, but also more complex higher-order aberrations such as coma and spherical aberration. The WA provide a comprehensive description of the optics of the eye, and thus determine visual acuity, but until recently a practical method of computing this relationship did not exist. Our solution is to simulate an observer performing the acuity task with an eye possessing a particular set of WA. When a letter is presented, we first distort a digital image of the letter by the specified WA and add noise to mimic the noisiness of the visual system; from previous research, we have determined the appropriate noise level to match human performance. We then attempt to match the blurred, noisy image to similarly blurred candidate letter images and select the closest match. Repeating this for many trials at many letter sizes determines the smallest letter that can be reliably identified: the visual acuity. We have streamlined and simplified the key steps of this simulation approach so that the entire process is robust, accurate, simple, and fast. Results are typically obtained in a few seconds.
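The blur/add-noise/template-match loop above can be sketched in a few lines. This is a minimal toy, not the actual simulator: it uses a Gaussian kernel as a stand-in for the PSF derived from the wavefront aberrations, tiny 3x3 "letters" as stand-ins for letter images, and an arbitrary noise level:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16

def embed(t):
    # Place a small letter template on a blank canvas.
    img = np.zeros((N, N))
    img[6:9, 6:9] = t
    return img

def gaussian_psf(sigma):
    # Stand-in for the PSF computed from the eye's wavefront aberrations.
    y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
    p = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return p / p.sum()

def blur(img, psf):
    # Circular convolution via FFT: optical blurring of the retinal image.
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))

letters = {  # toy 3x3 letter shapes (hypothetical stand-ins)
    "T": np.array([[1, 1, 1], [0, 1, 0], [0, 1, 0]], float),
    "L": np.array([[1, 0, 0], [1, 0, 0], [1, 1, 1]], float),
    "X": np.array([[1, 0, 1], [0, 1, 0], [1, 0, 1]], float),
}
psf = gaussian_psf(sigma=1.0)
templates = {k: blur(embed(v), psf) for k, v in letters.items()}

def identify(shown, noise_sd=0.05):
    # Blur the shown letter, add noise mimicking the visual system,
    # then pick the similarly blurred template with the smallest error.
    obs = blur(embed(letters[shown]), psf) + rng.normal(0, noise_sd, (N, N))
    return min(templates, key=lambda k: np.sum((obs - templates[k])**2))

trials = [identify("T") for _ in range(50)]
accuracy = trials.count("T") / len(trials)
```

Running such trials across a range of letter sizes, and finding the size at which accuracy drops to a criterion level, yields the simulated acuity.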
Measuring Optical Null Lens Aberrations
This innovation uses image-based wavefront sensing, or phase retrieval, in which light travels through the optical system in generally the same way that it will in its as-used configuration. It estimates the wavefront of the optical system using images collected on a light detector (which can be the same detector the optical system uses in its intended application). The method takes several images through the null optic under test, with some aspect of the test setup systematically varied from one image to the next. Usually this variation consists of taking somewhat out-of-focus images with systematically different amounts of defocus. The light for each diversity-defocus image passes through the optical test setup and the null lens. The resulting irradiance of the transmitted beam is measured by a light detector, and the image is saved on a computer, where algorithms determine an estimate of the null lens's wavefront aberrations.
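The forward model that such a phase-retrieval algorithm fits can be sketched compactly: the measured irradiance is the squared magnitude of the Fourier transform of the pupil field, with a known quadratic defocus phase added for each diversity image. This is an illustrative sketch only (the aberration, grid size, and defocus values are assumptions, and the estimation algorithm itself is omitted):

```python
import numpy as np

N = 64
y, x = (np.mgrid[0:N, 0:N] - N // 2) / (N // 4)
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)        # circular aperture

# Hypothetical unknown aberration to be estimated (here: astigmatism, radians).
W = 0.6 * pupil * (x**2 - y**2)

def psf(defocus):
    # Forward model: detector irradiance for a given amount of quadratic
    # defocus phase (radians) added across the pupil.
    phase = W + defocus * r2 * pupil
    field = pupil * np.exp(1j * phase)
    img = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return img / img.sum()

# The set of diversity-defocus images a phase-retrieval algorithm would fit:
images = {d: psf(d) for d in (-2.0, -1.0, 0.0, 1.0, 2.0)}
```

The estimator then searches for the aberration map W whose predicted images best match all of the measured diversity images simultaneously; the known defocus differences between images are what make the phase estimate well constrained.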
Variable Visibility Glasses for Instrument Flight Training
The technology combines electroactively controlled liquid crystal lenses with a means of determining the pilot's head position. When the pilot's head is positioned to look out the front or side windscreens, the lenses restrict light transmission. When the pilot's head is lowered to view the instrument panel or other cockpit displays, the lenses allow light transmission so that the view of the instruments is unimpeded. Light transmission through the lenses can be selectively controlled by the system, ranging from 0.1% to 10%. The lenses are mounted in conventional eyeglass frames, which include a detection system to determine the position and orientation of the pilot's head. Circuits within the frames activate the lenses to restrict light transmission when the pilot's head is oriented to look out the windscreen. A PC, linked to the aircraft flight computer or altimeter, is also in the control loop and turns off the system to allow unimpeded visibility when the aircraft is below 200 feet or under other specified conditions. The technology readiness level of this invention is seven (TRL 7), a prototype having been tested.
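The control loop described above reduces to a small decision rule. The sketch below is a hypothetical rendering of that logic, using the stated 0.1% and 10% transmission endpoints and the 200-foot cutoff; the actual system's states and thresholds may differ:

```python
def lens_transmission(looking_at_instruments, altitude_ft, override_off=False):
    """Return the lens light-transmission fraction for the training glasses.

    Hypothetical control logic sketched from the description: restrict the
    view out the windscreen unless the pilot is looking at the instruments,
    and allow full visibility below 200 ft or on any override condition.
    """
    RESTRICTED, CLEAR = 0.001, 0.10   # 0.1% and 10% transmission endpoints
    if override_off or altitude_ft < 200:
        return CLEAR                  # system off: unimpeded visibility
    return CLEAR if looking_at_instruments else RESTRICTED

# Looking out the windscreen at altitude: view restricted.
state_a = lens_transmission(looking_at_instruments=False, altitude_ft=5000)
# Head down on the instruments, or below 200 ft: view clear.
state_b = lens_transmission(looking_at_instruments=True, altitude_ft=5000)
state_c = lens_transmission(looking_at_instruments=False, altitude_ft=150)
```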
A Process for Producing High-Quality Lightweight Mirrors
Each mirror is a monolithic structure consisting of a face sheet with a highly polished front optical surface. In Goddard's process for making these mirrors, the optical surface is formed in a solid single-crystal silicon (SCS) blank, either by conventional grinding and polishing or by diamond turning. The blank is then lightweighted using Computer Numerical Control (CNC) grinding. For critical applications, post-lightweighting polishing can be performed to further improve the optical surface; because very little material is removed in this step, it produces no quilting or print-through of the lightweight support structure. At several points during the process, the mirror is heated to near its melting point to remove small crystalline defects caused by fabrication. Each resulting SCS mirror features a homogeneous composition free of internal stress, which inhibits distortion when the mirror is cooled to cryogenic temperatures. Under such conditions, the mirrors maintain their optical figure to 1/50th wave root mean square (RMS) or better. At room temperature, SCS has a thermal conductivity about the same as aluminum and a coefficient of thermal expansion about equal to that of Pyrex glass, so SCS mirrors are extremely resistant to thermal shock and ideal for applications where high heat dissipation is required.

Virtually all conventional lightweight mirrors are made by optically grinding and polishing an already-lightweighted blank. Mirrors made this way always risk print-through to the optical surface. In some cases print-through can be removed by a zero-pressure process such as ion-beam polishing, although these processes tend to be slow and costly. Lightweighting after optical polishing is not an option for conventional materials, as their inhomogeneities and internal stresses cause the lightweighting to distort the optical surface.
By contrast, in Goddard's process for SCS mirrors, optical grinding and polishing are done before lightweighting, eliminating the possibility of print-through. This is made possible by the extreme homogeneity and absence of stress in a monolithic structure made from a single crystal. The result is a simple, cost-effective process capable of producing mirrors of exceptional quality.
High-Speed Smart Camera Detects Supersonic Inlet Shocks
In order for the camera to detect invisible air shocks in an aircraft engine's intake, a fine sheet of laser light is first projected through the airflow. The light is refracted in the densest part of the airflow (the location of the shock), creating a dark spot that shows up as a dip, or negative peak, in the pixel intensity profile of the image. The smart camera uses this information to identify a negative-going edge and a positive-going edge, expressed as numeric pixel values within the linear array. Data are output from the circuit as an analog signal, or digitally by an onboard microcontroller using a parallel digital bus or a serial interface such as controller area network (CAN bus), Ethernet, RS-232/485, or USB. Unlike conventional edge detection systems, which rely on both a high-speed camera and a bulky computer or digital signal processor, this innovation uses an analog technique to process images. Its simple, sleek design consists of three basic parts: a linear image sensor, an analog signal-processing circuit, and a digital circuit. The result is a smaller, more reliable technology with increased processing frame rates. The design can easily be tailored to the end use and can be reconfigured to respond to positive- and/or negative-going edges. Furthermore, the threshold sensitivity can be varied and algorithmically set, making it well suited for a number of other terrestrial applications, from transportation to manufacturing.
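The edge-finding step can be mirrored in software to make the idea concrete. This sketch (the profile shape and threshold are illustrative assumptions; the real device does this with analog circuitry) locates the negative-going and positive-going threshold crossings that bracket the shock's dark stripe in a one-dimensional line-scan profile:

```python
import numpy as np

def detect_shock(profile, threshold):
    """Locate the shock's dark stripe in a 1-D line-scan intensity profile.

    Find where the signal crosses below the threshold (negative-going edge)
    and back above it (positive-going edge); the shock lies between them.
    """
    below = profile < threshold
    crossings = np.flatnonzero(np.diff(below.astype(int)))
    if len(crossings) < 2:
        return None                      # no dip found on this scan line
    neg_edge = crossings[0] + 1          # first pixel below threshold
    pos_edge = crossings[-1] + 1         # first pixel back above threshold
    return neg_edge, pos_edge

# Synthetic line scan: bright background with a dip where light was
# refracted away by the shock.
pixels = np.full(128, 200.0)
pixels[60:68] -= 150.0                   # the dark spot cast by the shock
neg, pos = detect_shock(pixels, threshold=100.0)
```

Because the threshold is a single parameter, the same logic extends naturally to the variable, algorithmically set sensitivity the abstract mentions.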
Multispectral Imaging, Detection, and Active Reflectance (MiDAR)
The MiDAR transmitter emits coded narrowband structured illumination to generate high-frame-rate multispectral video, perform real-time radiometric calibration, and provide a high-bandwidth simplex optical data-link under a range of ambient irradiance conditions, including darkness. A theoretical framework, based on unique color band signatures, is developed for the multispectral video reconstruction and optical communications algorithms used on MiDAR transmitters and receivers. Experimental tests demonstrate a 7-channel MiDAR prototype consisting of an active array of multispectral high-intensity light-emitting diodes (MiDAR transmitter) coupled with a state-of-the-art, high-frame-rate NIR computational imager, the NASA FluidCam NIR, which functions as a MiDAR receiver. A 32-channel instrument is currently in development. Preliminary results confirm efficient, radiometrically calibrated, high signal-to-noise ratio (SNR) active multispectral imaging in 7 channels from 405 to 940 nm at 2048x2048 pixels and 30 Hz. These results demonstrate a cost-effective and adaptive sensing modality, with the ability to change color bands and relative intensities in real time in response to changing science requirements or dynamic scenes. Potential applications of MiDAR include high-resolution nocturnal and diurnal multispectral imaging from air, space, and underwater environments, as well as long-distance optical communication, bidirectional reflectance distribution function characterization, mineral identification, atmospheric correction, UV/fluorescent imaging, 3D reconstruction using Structure from Motion (SfM), and underwater imaging using Fluid Lensing. Multipurpose sensors such as MiDAR, which fuse active sensing and communications capabilities, may be particularly well suited for mass-limited robotic exploration of Earth and the solar system and represent a possible new generation of instruments for active optical remote sensing.
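The idea of reconstructing multispectral video from coded illumination can be sketched as a linear demultiplexing problem. This is a simplified illustration under assumed conditions, not MiDAR's actual algorithm: if each band's intensity over a sequence of frames follows a known code (its "signature"), each receiver frame is a code-weighted mixture of the per-band scene responses, and per-pixel least squares recovers the bands:

```python
import numpy as np

rng = np.random.default_rng(0)
bands, frames, pixels = 3, 5, 16

# Known per-frame intensity code for each band (rows: frames, cols: bands).
# These codes are illustrative; MiDAR's actual band signatures differ.
C = rng.uniform(0.2, 1.0, (frames, bands))

# Ground-truth per-pixel scene response in each band (to be recovered).
R = rng.uniform(0.0, 1.0, (bands, pixels))

# Receiver frames: each frame records a code-weighted sum of band responses,
# plus a small amount of sensor noise.
Y = C @ R + rng.normal(0, 1e-3, (frames, pixels))

# Least-squares demultiplexing recovers the multispectral image stack.
R_hat, *_ = np.linalg.lstsq(C, Y, rcond=None)
```

Because the codes are known at the receiver, the same structure supports the radiometric calibration and the simplex data-link roles: departures of the received mixture from the expected code carry calibration and communication information.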