
optics
Automated Vision Test
The wavefront aberrations (WA) are a collection of different sorts of optical defects, including the familiar defocus and astigmatism that are corrected by eyeglasses, but also more complex higher-order aberrations such as coma, spherical aberration, and others. The WA provide a comprehensive description of the optics of the eye, and thus determine the acuity. But until recently, a practical method of computing this relationship did not exist. Our solution to this problem is to simulate the observer performing the acuity task with an eye possessing a particular set of WA. When a letter is presented, we first distort a digital image of the letter by the specified WA and add noise to mimic the noisiness of the visual system. From previous research, we have determined the appropriate noise level to match human performance. We then attempt to match the blurred, noisy image to similarly blurred candidate letter images, and select the closest match. We repeat this for many trials at many letter sizes, and thereby determine the smallest letter that can be reliably identified: the visual acuity. We have streamlined and simplified the key steps for this simulation approach so that the entire process is robust, accurate, simple, and fast. Results are typically obtained in a few seconds.
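A minimal sketch of this simulate-and-match loop is shown below. It assumes a simple Gaussian blur in place of the actual wavefront-aberration distortion, crude 5x5 letter shapes in place of real optotypes, and an arbitrary noise level and 80% criterion; all of these are illustrative assumptions, not the calibrated values used in the actual tool.

```python
"""Toy acuity simulation: blur candidate letters, add noise, and pick the
closest template.  Gaussian blur stands in for the wavefront aberrations; the
noise level, letter shapes, and criterion are illustrative assumptions."""
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(0)

# Crude 5x5 binary "letters" used as stand-ins for optotypes.
LETTERS = {
    "C": np.array([[1,1,1,1,1],[1,0,0,0,0],[1,0,0,0,0],[1,0,0,0,0],[1,1,1,1,1]], float),
    "E": np.array([[1,1,1,1,1],[1,0,0,0,0],[1,1,1,1,0],[1,0,0,0,0],[1,1,1,1,1]], float),
    "L": np.array([[1,0,0,0,0],[1,0,0,0,0],[1,0,0,0,0],[1,0,0,0,0],[1,1,1,1,1]], float),
    "O": np.array([[1,1,1,1,1],[1,0,0,0,1],[1,0,0,0,1],[1,0,0,0,1],[1,1,1,1,1]], float),
}

def render(letter, size_px, blur_sigma):
    """Scale a letter to size_px and blur it (stand-in for applying the WA)."""
    img = zoom(LETTERS[letter], size_px / 5.0, order=1)
    return gaussian_filter(img, blur_sigma)

def trial(true_letter, size_px, blur_sigma, noise_sd):
    """One simulated trial: noisy blurred stimulus vs. blurred candidates."""
    stimulus = render(true_letter, size_px, blur_sigma)
    stimulus = stimulus + rng.normal(0.0, noise_sd, stimulus.shape)
    # Pick the candidate template with the smallest squared difference.
    best = min(LETTERS, key=lambda c: np.sum((render(c, size_px, blur_sigma) - stimulus) ** 2))
    return best == true_letter

def acuity(blur_sigma, noise_sd=0.6, n_trials=200, criterion=0.8):
    """Smallest letter size (in pixels) identified at the criterion rate."""
    smallest_readable = None
    for size_px in range(30, 4, -2):          # shrink the letter until it fails
        correct = sum(
            trial(rng.choice(list(LETTERS)), size_px, blur_sigma, noise_sd)
            for _ in range(n_trials)
        )
        if correct / n_trials < criterion:
            break
        smallest_readable = size_px
    return smallest_readable

print("estimated acuity (px) with mild blur :", acuity(blur_sigma=1.0))
print("estimated acuity (px) with heavy blur:", acuity(blur_sigma=3.0))
```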
information technology and software
NASA's Hubble Space Telescope has revisited the famous Pillars of Creation, revealing a sharper and wider view of the structures in this visible-light image.
Autonomic Autopoiesis
Highly distributed next-generation computer-based systems require self-managing environments that feature a range of autonomic computing techniques. This functionality is provided by collaborating agents, and includes an apoptotic (self-destruct) mechanism, autonomic quiescence (self-sleep), and others. The apoptotic feature is necessary to maintain system security and integrity when a component endangers the overall operation and viability of the entire system. However, the self-destruction of an agent/component may remove a key piece of functionality. The novel autopoietic functionality provides the capability to duplicate or substitute a new agent that provides the functionality of the self-destructed component.
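A minimal sketch of the apoptosis-plus-autopoiesis pattern described above, in which a supervising component regenerates the functionality of an agent that has self-destructed. The Agent/Supervisor classes and the health-check rule are illustrative assumptions, not the actual implementation.

```python
"""Minimal sketch of apoptosis (self-destruct) plus autopoiesis (self-
regeneration) in a multi-agent system.  Class names and the health-check rule
are illustrative assumptions."""
from dataclasses import dataclass, field


@dataclass
class Agent:
    role: str                 # the piece of functionality this agent provides
    healthy: bool = True
    alive: bool = True

    def apoptose(self):
        """Self-destruct when the agent endangers the overall system."""
        self.alive = False


@dataclass
class Supervisor:
    agents: list = field(default_factory=list)

    def monitor(self):
        # Apoptotic step: unhealthy agents remove themselves.
        for agent in list(self.agents):
            if agent.alive and not agent.healthy:
                agent.apoptose()
        # Autopoietic step: re-create any functionality that was lost.
        live_roles = {a.role for a in self.agents if a.alive}
        for agent in list(self.agents):
            if not agent.alive and agent.role not in live_roles:
                self.agents.append(Agent(role=agent.role))
                live_roles.add(agent.role)


# Usage: a compromised "telemetry" agent self-destructs and is regenerated.
sup = Supervisor([Agent("telemetry"), Agent("scheduler")])
sup.agents[0].healthy = False
sup.monitor()
print([(a.role, a.alive) for a in sup.agents])
# [('telemetry', False), ('scheduler', True), ('telemetry', True)]
```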
electrical and electronics
SpaceCube 1.0b
SpaceCube
Next generation instruments are capable of producing data at rates of 10^8 to 10^11 bits per second, and both their instrument designs and mission operations concepts are severely constrained by data rate/volume. SpaceCube is an enabling technology for these next generation missions. SpaceCube has demonstrated enabling capabilities in Earth Science, Planetary, Satellite Servicing, Astrophysics and Heliophysics prototype applications such as on-board product generation, intelligent data volume reduction, autonomous docking/landing, direct broadcast products, and data-driven processing with the ability to autonomously detect and react to events. SpaceCube systems are currently being developed and proposed for platforms from small CubeSats to larger scale experiments on the ISS and standalone free-flyer missions, and are an ideal fit for cost-constrained next generation applications due to the tremendous flexibility (both functional and interface compatibility) provided by the SpaceCube system.
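As a toy illustration of the kind of on-board, data-driven processing and intelligent data volume reduction mentioned above, the sketch below keeps only the samples surrounding detected events instead of the full raw stream. The threshold detector, window length, and synthetic data are illustrative assumptions, not SpaceCube flight software.

```python
"""Toy on-board data volume reduction: keep only the windows of data around
detected events rather than the entire raw instrument stream."""
import numpy as np

rng = np.random.default_rng(1)

def detect_events(samples, threshold):
    """Indices where the signal exceeds the detection threshold."""
    return np.flatnonzero(np.abs(samples) > threshold)

def reduce_volume(samples, threshold=4.0, window=16):
    """Return only the samples surrounding detected events, plus the kept fraction."""
    keep = np.zeros(samples.size, dtype=bool)
    for i in detect_events(samples, threshold):
        keep[max(0, i - window):i + window] = True
    return samples[keep], keep.mean()

# Simulated raw instrument stream: noise plus two brief "events".
raw = rng.normal(0, 1, 100_000)
raw[25_000:25_010] += 8.0
raw[70_000:70_005] += 8.0

kept, fraction = reduce_volume(raw)
print(f"downlinked {kept.size} of {raw.size} samples ({fraction:.2%})")
```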
electrical and electronics
Smart Car Navigation
Sampling and Control Circuit Board
For fast platform dynamics, it is necessary to sample the inertial measurement unit (IMU) at a high rate in order to satisfy the Nyquist sampling criterion. This can be difficult in cases where low size, weight, and power are required, since a primary processor may already be saturated running the navigation algorithm or other system functions. Glenn's novel circuit board was designed to handle the sampling process (involving frequent interrupt requests) in parallel, while delivering the resulting data to a buffered communication port for inclusion in the navigation algorithm on an as-available basis. The circuit operates using a universal serial bus (USB) or Bluetooth interface. A control command is sent to the circuit from a separate processor or computer that instructs the circuit how to sample data. Then, a one-pulse-per-second signal from a GPS receiver or other reliable time source triggers the circuit to perform automatic data collection from the IMU sensor. This is an early-stage technology requiring additional development. Glenn welcomes co-development opportunities.
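The host-side pattern this board enables can be sketched as a producer-consumer buffer: IMU samples accumulate asynchronously while the slower navigation loop drains whatever is available on each cycle. The sampling thread below only simulates the board, and the rates and message format are illustrative assumptions.

```python
"""Sketch of buffered IMU delivery: a fast sampler fills a queue (standing in
for the board's buffered USB/Bluetooth port), and the navigation loop consumes
samples on an as-available basis."""
import queue
import threading
import time

imu_buffer = queue.Queue()           # stands in for the buffered communication port

def sampling_board(rate_hz=400, duration_s=0.5):
    """Simulated circuit board: samples the IMU at a fixed high rate."""
    period = 1.0 / rate_hz
    t0 = time.time()
    while time.time() - t0 < duration_s:
        sample = {"t": time.time(), "accel": (0.0, 0.0, 9.81), "gyro": (0.0, 0.0, 0.0)}
        imu_buffer.put(sample)
        time.sleep(period)

def navigation_loop(cycles=5, cycle_s=0.1):
    """Slower navigation algorithm: drains whatever samples have accumulated."""
    for _ in range(cycles):
        time.sleep(cycle_s)          # pretend the processor is busy with other work
        batch = []
        while not imu_buffer.empty():
            batch.append(imu_buffer.get_nowait())
        print(f"navigation update consumed {len(batch)} buffered IMU samples")

board = threading.Thread(target=sampling_board, daemon=True)
board.start()
navigation_loop()
board.join()
```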
aerospace
Inflight Global Nonlinear Aerodynamics Modeling and Simulation
This technology, the first phase in a suite of technologies designed to control autonomous (unmanned) vehicles, is a method of developing a model that characterizes the aerodynamics and/or thrust of an airplane or spacecraft. The model can be developed and flight-validated in as little as a single flight by tracking the aircraft's response to control inputs and to external forces acting on the vehicle. The current state of the art requires repeated test flights to gather data from different flight conditions, each followed by evaluation and analysis on the ground, before eventually yielding a combined, approximate linear model. This technology, however, can produce a validated, high-fidelity, global, nonlinear model in as little as a single flight. The technology is part of the Learn to Fly project to develop real-time models and controllers for autonomous aircraft and spacecraft. For application in piloted aircraft, the system can provide alerts in dangerous conditions, such as tail icing, by detecting abnormal aircraft responses. Potential applications include not only aircraft but also spacecraft, marine craft, and self-driving cars and trucks.
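A minimal sketch of identifying a nonlinear aerodynamic model from streaming flight data with recursive least squares is given below. The synthetic pitching-moment model, polynomial regressors, and noise level are illustrative assumptions, not the Learn to Fly algorithms themselves.

```python
"""Minimal sketch: fit a global nonlinear aerodynamic coefficient model from
streaming flight data using recursive least squares (RLS).  The "truth" model,
regressors, and noise level are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(2)

def regressors(alpha, delta_e):
    """Nonlinear candidate terms for the pitching-moment coefficient Cm."""
    return np.array([1.0, alpha, delta_e, alpha**2, alpha * delta_e])

theta = np.zeros(5)                 # model parameters, refined in flight
P = np.eye(5) * 1e3                 # parameter covariance

true_theta = np.array([0.02, -0.8, -1.1, -2.5, 0.3])   # synthetic "truth"

for _ in range(2000):               # one update per control/response sample
    alpha = rng.uniform(-0.2, 0.35)       # angle of attack [rad]
    delta_e = rng.uniform(-0.3, 0.3)      # elevator deflection [rad]
    x = regressors(alpha, delta_e)
    cm = true_theta @ x + rng.normal(0, 0.01)   # measured Cm with sensor noise

    # Standard RLS update.
    k = P @ x / (1.0 + x @ P @ x)
    theta += k * (cm - x @ theta)
    P -= np.outer(k, x @ P)

print("identified parameters:", np.round(theta, 3))
print("true parameters:      ", true_theta)
```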
environment
Robonaut 2: Hazardous Environments
Robonaut 2 (R2) can function autonomously or be controlled by direct teleoperation, which is advantageous for hazardous environments. When functioning autonomously, R2 understands what to do and how to do it based on sensory input. R2's torso holds the control system, while the visor holds several cameras that are incorporated into the visual perception system. With these capabilities, R2 can reduce or eliminate the need for humans to be exposed to dangerous environments. R2 also has a very rugged four-wheel base called the Centaur 2. The Centaur 2 base can lower or raise itself to and from the ground and turn its wheels in any direction, allowing it to turn in place and drive forward or sideways. This enables R2 to enter hazardous areas or tackle difficult terrain without endangering its human operator. Robonaut 2 as a whole, or some of its components, can be an invaluable tool for land mine detection, bomb disposal, search and rescue, waste recycling, medical quarantine areas, and much more. The suite of technologies provides an ability to manipulate tools to help with a task, or it can tackle many tasks in a row, where a standard robot may not have the dexterity or sensing capability to get the job done. R2 could pick through nuclear waste, measure toxicity levels, and survey areas too remote or dangerous for human inspection. R2 could deal with improvised explosive devices, detect and dispose of bombs or landmines, and operate equipment that can break through walls or doors.
instrumentation
Powder Handling Device for Analytical Instruments
This invention is a system and associated method for causing a fine-grained powder in a sample holder to undergo at least one of three motions (vibration, rotation or translation) at a selected motion frequency in order to expose a statistically relevant population of grains in random orientation to a diffraction or fluorescent source. One or more measurements of diffraction, fluorescence, spectroscopic interaction, transmission, absorption and/or reflection can be made on the sample, using x-rays or light in a selected wavelength region. In one embodiment, the invention allows the relaxation of sample preparation and handling requirements for powder X-ray Diffraction (pXRD). The sample, held between two thin plastic windows, undergoes granular convection similar to convection in a heated liquid, causing the individual grains to move past a collimated X-ray beam in random orientation over time. The result is an X-ray diffraction pattern having the correct diffracted intensities without a requirement for specialized mechanical motions. A major improvement over conventional sample preparation and handling techniques for pXRD is the potential to characterize larger grain-size material, resulting in a significant relaxation of the constraints on sample preparation (grinding). The powder handling system as described extends the range of useful grain sizes for XRD/X-ray fluorescence (XRF) from a few micrometers (µm) to several hundred µm. Inclusion of the powder handling system enables automated instruments such as CheMin, a robotic XRD/XRF instrument designed and developed by NASA, to analyze as-received or coarsely powdered samples on NASA's Mars Science Laboratory rover, or in extreme, toxic or hazardous environments on Earth.
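A toy Monte Carlo sketch of why exposing a statistically relevant population of randomly oriented grains matters: the observed relative peak intensities only converge to the correct powder-average pattern once enough orientations have passed through the beam. The peak probabilities and grain counts below are illustrative assumptions, not a physical diffraction model.

```python
"""Toy Monte Carlo: measured relative peak intensities converge to the true
powder-average values as more randomly oriented grains cross the beam."""
import numpy as np

rng = np.random.default_rng(3)

# "True" relative intensities of three diffraction peaks for a powder average.
true_intensity = np.array([1.00, 0.55, 0.20])
p_diffract = 0.01 * true_intensity   # chance a random grain orientation hits each peak

def measure(n_grains):
    """Normalized counts per peak after n randomly oriented grains cross the beam."""
    counts = rng.binomial(n_grains, p_diffract)
    return counts / max(counts.max(), 1)

for n in (1_000, 10_000, 1_000_000):
    print(f"{n:>9} grains -> relative intensities {np.round(measure(n), 2)}")
print("           true relative intensities", true_intensity)
```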
manufacturing
Robonaut 2 is a dexterous robot able to work with tools and equipment designed for human use.
Robonaut 2: Industrial Opportunities
NASA, GM, and Oceaneering approached the development of R2 as a dual-use effort for both space and terrestrial applications. NASA needed an astronaut assistant able to function in space, and GM was looking for a robot that could function in an industrial setting. With this in mind, R2 was made with many capabilities that offer an enormous advantage in industrial environments. For example, the robot has the ability to retool and vary its tasks. Rather than a product moving from station to station on a conveyor with dozens of specialized robots performing unique tasks, R2 can handle several assembly steps at a single station, thereby reducing manufacturing floor space requirements and the need for multiple robots for the same activities. The robot can also be used in scenarios where dangerous chemical, biological, or even nuclear materials are part of the manufacturing process. R2 uses stereo vision to locate human teammates or tools, and it includes a navigation system. The robot was also designed with special torsional springs and position feedback to control fine motor movements in the hands and arms. R2's hands and arms sense weight and pressure and stop when they come in contact with someone or something. These force-sensing capabilities make R2 safe to work side-by-side with people on an assembly line, assisting them in ergonomically challenging tasks or working independently. This NASA technology is available for your company to license and develop into a commercial product. NASA does not manufacture products for commercial sale.
manufacturing
Interim, In Situ Additive Manufacturing Inspection
The in situ inspection technology for additive manufacturing combines different types of cameras strategically placed around the part to monitor its properties during construction. The IR cameras collect accurate temperature data to validate thermal math models, while the visual cameras obtain highly detailed data at the exact location of the laser to build accurate, as-built geometric models. Furthermore, certain adopted techniques (e.g., single to grouped pixels comparison to avoid bad/biased pixels) reduce false positive readings. NASA has developed and tested prototypes in both laser-sintered plastic and metal processes. The technology detected errors due to stray powder sparking and material layer lifts. Furthermore, the technology has the potential to detect anomalies in the property profile that are caused by errors due to stress, power density issues, incomplete melting, voids, incomplete fill, and layer lift-up. Three-dimensional models of the printed parts were reconstructed using only the collected data, which demonstrates the success and potential of the technology to provide a deeper understanding of the laser-metal interactions. By monitoring the print, layer by layer, in real-time, users can pause the process and make corrections to the build as needed, reducing material, energy, and time wasted in nonconforming parts.
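A sketch of the grouped-pixel comparison mentioned above: small blocks of an IR frame are compared against an expected melt-temperature band, so that a single bad or biased pixel does not trigger a false positive while a genuine hot spot spanning a whole block does. The frame size, temperatures, block size, and threshold are illustrative assumptions.

```python
"""Grouped-pixel anomaly check on a simulated IR frame of one printed layer."""
import numpy as np

rng = np.random.default_rng(4)

def layer_anomalies(ir_frame, expected_c=180.0, tolerance_c=25.0, group=4):
    """Flag group x group pixel blocks whose mean temperature is out of band."""
    h, w = ir_frame.shape
    flags = []
    for r in range(0, h - group + 1, group):
        for c in range(0, w - group + 1, group):
            block_mean = ir_frame[r:r + group, c:c + group].mean()
            if abs(block_mean - expected_c) > tolerance_c:
                flags.append((r, c, round(float(block_mean), 1)))
    return flags

# Simulated IR frame of one layer: nominal temperatures plus one hot spot
# (e.g., stray powder sparking) and one isolated bad pixel.
frame = rng.normal(180.0, 5.0, (64, 64))
frame[40:44, 8:12] += 120.0      # real anomaly: spans a whole pixel group
frame[10, 10] = 0.0              # single bad pixel: diluted by its group, not flagged

print("flagged blocks (row, col, mean °C):", layer_anomalies(frame))
```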
sensors
NASA robotic vehicle prototype
Super Resolution 3D Flash LIDAR
This suite of technologies includes a method, algorithms, and computer processing techniques to provide for image photometric correction and resolution enhancement at video rates (30 frames per second). This 3D (2D spatial and range) resolution enhancement uses the spatial and range information contained in each image frame, in conjunction with a sequence of overlapping or persistent images, to simultaneously enhance the spatial resolution and the range and photometric accuracies. In other words, these technologies allow for generating an elevation (3D) map of a targeted area (e.g., terrain) with greatly enhanced resolution by blending consecutive camera image frames. The degree of image resolution enhancement increases with the number of acquired frames.
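A minimal shift-and-add sketch of this multi-frame idea: each low-resolution range frame is placed onto a finer grid according to its known sub-pixel offset, and the overlaps are averaged to sharpen the recovered elevation map. The synthetic scene, offsets, noise level, and 2x enhancement factor are illustrative assumptions, not the patented algorithm.

```python
"""Shift-and-add multi-frame resolution enhancement for a range (elevation) image."""
import numpy as np

rng = np.random.default_rng(5)
UP = 2                                   # resolution enhancement factor

# High-resolution "ground truth" elevation map (meters), with structure edges
# that fall between low-resolution pixels.
truth = np.zeros((64, 64))
truth[21:43, 21:43] = 3.0

def capture(dy, dx, noise_m=0.05):
    """Low-resolution, noisy range frame taken at a known sub-pixel offset."""
    shifted = np.roll(truth, (dy, dx), axis=(0, 1))
    low_res = shifted.reshape(32, UP, 32, UP).mean(axis=(1, 3))   # detector binning
    return low_res + rng.normal(0.0, noise_m, low_res.shape)

frames = [(dy, dx, capture(dy, dx)) for dy in range(UP) for dx in range(UP)]

# Shift-and-add: place each frame onto the fine grid at its offset and average.
accum = np.zeros_like(truth)
for dy, dx, frame in frames:
    upsampled = np.kron(frame, np.ones((UP, UP)))          # nearest-neighbor upsample
    accum += np.roll(upsampled, (-dy, -dx), axis=(0, 1))   # undo the capture shift
estimate = accum / len(frames)

def rms(err):
    return round(float(np.sqrt(np.mean(err ** 2))), 3)

single = np.kron(frames[0][2], np.ones((UP, UP)))          # one frame alone, upsampled
print("single-frame RMS range error:", rms(single - truth))
print("multi-frame  RMS range error:", rms(estimate - truth))
```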