
Environment
Closed Ecological System Network Data Collection, Analysis, Control, and Optimization System
The technology relates generally to controlled ecosystems, and more particularly to a Controlled Closed-Ecosystem Development System (CCEDS) that can be used to develop designs for sustainable, small-scale reproductions of subsets of the Earth's biosphere, together with the Orbiting Modular Artificial-Gravity Spacecraft (OMAGS). A CCEDS includes one or more Closed Ecological Systems (CESs), each having one or more Controlled Ecosystem Modules (CESMs). Each CESM can have a biome containing at least one organism, plus equipment comprising one or more sensors, actuators, or other components associated with the biome. A controller operates the equipment to transfer material among CESMs in order to optimize one or more CESM biomes with respect to organism population health, resilience, variety, quantities, biomass, and sustainability.

A CES is a community of organisms and their resources that persists in a sealed volume such that mass is neither added nor removed. The mass (food, air, water) required by the CES organisms is continually recycled from the mass (waste) produced by the organisms; only energy and information may be transferred to and from a CES. CES research promises to become a significant resource for the resolution of global ecology problems that have thus far been experimentally inaccessible, and may well prove invaluable for predicting the probable ecological consequences of anthropogenic materials on regional ecosystems.

To create CESs that are orders of magnitude smaller than the Earth and able to function without it, the desired gravity level and the necessary radiation shielding must be provided by other means. The Orbiting Modular Artificial-Gravity Spacecraft (OMAGS) is a fractional-gravity spacecraft design for CES payloads. In tandem, the CCEDS and OMAGS systems can be used to foster gravitational ecosystem research for developing sustainable communities in space and on Earth.
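As a purely illustrative aid, the sketch below models the hierarchy described above (a CES containing CESMs, each with a biome, sensors, actuators, and material stores, plus a transfer operation that moves material between modules while conserving mass inside the sealed system). All class names, fields, and the transfer method are assumptions made for illustration; they are not drawn from the CCEDS design itself.

```python
# Illustrative-only data model of the CES/CESM hierarchy described above.
# Names and fields are assumptions, not the actual CCEDS software design.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CESM:
    """Controlled Ecosystem Module: a biome plus its associated equipment."""
    name: str
    organisms: Dict[str, int]                                  # organism -> population count
    sensors: Dict[str, float] = field(default_factory=dict)    # e.g. {"co2_ppm": 410.0}
    actuators: List[str] = field(default_factory=list)         # e.g. ["pump", "valve"]
    stores: Dict[str, float] = field(default_factory=dict)     # material -> kg on hand


@dataclass
class CES:
    """Closed Ecological System: a sealed set of modules; mass stays inside."""
    name: str
    modules: Dict[str, CESM] = field(default_factory=dict)

    def transfer(self, material: str, kg: float, src: str, dst: str) -> None:
        """Move material between modules; total mass in the CES is conserved."""
        source, dest = self.modules[src], self.modules[dst]
        moved = min(kg, source.stores.get(material, 0.0))
        source.stores[material] = source.stores.get(material, 0.0) - moved
        dest.stores[material] = dest.stores.get(material, 0.0) + moved
```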
Electrical and Electronics
Highly secure all-printed Physically Unclonable Function (PUF) electronic device based on a nanomaterial network
The technology is an all-printed Physically Unclonable Function (PUF) electronic device based on a nanomaterial (such as single-walled carbon nanotube) network. The network may be a mixture of semiconducting and metallic nanotubes randomly tangled with each other through the printing process. The all-printed PUF electronic device comprises a nanomaterial ink that is inkjet deposited, dried, and randomly tangled on a substrate, creating a network. A plurality of electrode pairs is attached to the substrate around its perimeter. Each nanotube in the network can be a conduction path between electrode pairs, with the resistance values varying among individual pairs and between networks due to inherent inter-device and intra-device variability. The unique resistance distribution pattern for each network may be visualized using a contour map based on the electrode information, providing a PUF key that is a 2D pattern of analog values. The PUF security keys remain stable and maintain robustness against security attacks. Although local resistance change may occur inside the network (e.g., due to environmental impact), such change has little effect on the overall pattern. In addition, when a network-wide resistance change occurs, all resistances are affected together, so that the unique pattern is maintained.
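As a hedged illustration of the key-generation idea described above (not the actual device firmware), the sketch below turns a set of pairwise electrode resistances into a normalized 2D "fingerprint" and compares two fingerprints by correlation. The resistance matrix, the normalization, and the matching threshold are all assumptions; the example also shows why a uniform network-wide drift leaves the pattern, and hence the key, intact.

```python
# Illustrative sketch: deriving and comparing a PUF "fingerprint" from
# pairwise electrode resistances. Measurement details are assumptions.
import numpy as np


def fingerprint(resistances: np.ndarray) -> np.ndarray:
    """Normalize an NxN matrix of electrode-pair resistances (ohms) so that a
    network-wide drift (all values scaling together) does not change the
    relative 2D pattern used as the PUF key."""
    r = np.asarray(resistances, dtype=float)
    return (r - r.mean()) / (r.std() + 1e-12)


def same_device(meas_a: np.ndarray, meas_b: np.ndarray, threshold: float = 0.9) -> bool:
    """Declare a match if the two normalized patterns are strongly correlated.
    The 0.9 threshold is an arbitrary placeholder, not a device specification."""
    a, b = fingerprint(meas_a).ravel(), fingerprint(meas_b).ravel()
    corr = float(np.corrcoef(a, b)[0, 1])
    return corr >= threshold


# Example: a re-measurement with uniform drift plus small local noise still matches.
rng = np.random.default_rng(0)
enrolled = rng.uniform(1e3, 1e5, size=(8, 8))             # enrollment measurement
later = enrolled * 1.2 + rng.normal(0, 50, size=(8, 8))   # drift + local noise
print(same_device(enrolled, later))                       # True (pattern preserved)
```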
Robotics Automation and Control
Airborne Machine Learning Estimates for Local Winds and Kinematics
The MAchine learning ESTimations for uRban Operations (MAESTRO) system is a novel approach that couples commodity sensors with advanced algorithms to provide real-time onboard local wind and kinematics estimations to a vehicle's guidance and navigation system. Sensors and computations are integrated in a novel way to predict local winds and promote safe operations in dynamic urban regions where Global Positioning System/Global Navigation Satellite System (GPS/GNSS) and other network communications may be unavailable or difficult to obtain when surrounded by tall buildings, due to multi-path reflections and signal diffusion. The system can be implemented onboard an Unmanned Aerial System (UAS), and once airborne, it does not require communication with an external data source or with GPS/GNSS. Estimations of the local winds (speed and direction) are created using inputs from onboard sensors that scan the local building environment. This information can then be used by the onboard guidance and navigation system to determine safe and energy-efficient trajectories for operations in urban and suburban settings. The technology is robust to dynamic environments, input noise, missing data, and other uncertainties, and has been demonstrated successfully in lab experiments and computer simulations.
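The listing does not disclose MAESTRO's actual algorithms, but the kinematic relation that any onboard wind estimator ultimately exploits is the wind triangle: the wind vector is the difference between the vehicle's ground velocity and its air-relative velocity. The sketch below illustrates only that background relation with assumed sensor inputs; it is not the MAESTRO method and omits the machine-learning and building-scanning components described above.

```python
# Background illustration only: the wind-triangle relation underlying onboard
# wind estimation. Sensor inputs and their quality are assumptions.
import math

def wind_from_kinematics(ground_vel_ne, airspeed, heading_deg):
    """Estimate the horizontal wind vector (north, east) in m/s.

    ground_vel_ne : (north, east) ground velocity from onboard kinematics, m/s
    airspeed      : air-relative speed along the vehicle heading, m/s
    heading_deg   : vehicle heading, degrees clockwise from north
    """
    h = math.radians(heading_deg)
    air_vel_ne = (airspeed * math.cos(h), airspeed * math.sin(h))
    # wind = ground velocity - air-relative velocity
    return (ground_vel_ne[0] - air_vel_ne[0], ground_vel_ne[1] - air_vel_ne[1])

def speed_and_direction(wind_ne):
    """Convert a (north, east) wind vector to speed (m/s) and the meteorological
    'direction the wind blows from' in degrees clockwise from north."""
    north, east = wind_ne
    speed = math.hypot(north, east)
    blowing_from = (math.degrees(math.atan2(-east, -north)) + 360.0) % 360.0
    return speed, blowing_from

# Example: heading due north at 10 m/s airspeed while drifting east over the ground.
wind = wind_from_kinematics(ground_vel_ne=(9.0, 3.0), airspeed=10.0, heading_deg=0.0)
print(speed_and_direction(wind))  # ~3.2 m/s, from about 288 deg (west-northwest)
```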
Information Technology and Software
(Image: https://images.nasa.gov/details-iss062e000422)
Computer Vision Lends Precision to Robotic Grappling
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of the grapple fixtures relative to the robotic arm's end effectors. To solve this Perspective-n-Point challenge, the software uses computer vision algorithms to determine alignment solutions between the position of the camera eyepoint and the position of the end effector, because the borescope camera sensors are typically located several centimeters from their respective end effector grasping mechanisms. The software includes a machine learning component that uses a trained regional Convolutional Neural Network (r-CNN) to analyze a live camera feed and determine which ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS's main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations.

Industrial automation and robotics applications that rely on computer vision solutions may find value in this software's capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit. This computer vision software is at technology readiness level (TRL) 6 (system/sub-system model or prototype demonstration in an operational environment), and the software is now available to license. Please note that NASA does not manufacture products itself for commercial sale.
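For readers unfamiliar with the Perspective-n-Point problem mentioned above, the sketch below shows a generic PnP solve using OpenCV's solvePnP. The fixture keypoint coordinates, the detected image points, and the camera intrinsics are placeholder assumptions; this is not NASA's flight software, only a minimal example of the underlying pose-estimation step.

```python
# Generic Perspective-n-Point illustration with OpenCV; all numbers are
# placeholders, not ISS grapple-fixture geometry or flight camera intrinsics.
import numpy as np
import cv2

# 3D keypoints of a fixture in its own coordinate frame (meters) -- assumed, coplanar.
object_points = np.array([
    [0.00, 0.00, 0.0],
    [0.10, 0.00, 0.0],
    [0.10, 0.10, 0.0],
    [0.00, 0.10, 0.0],
], dtype=np.float64)

# Corresponding 2D detections in the camera image (pixels) -- e.g. from an
# object detector such as the r-CNN mentioned above; values are made up.
image_points = np.array([
    [320.0, 240.0],
    [420.0, 242.0],
    [418.0, 340.0],
    [322.0, 338.0],
], dtype=np.float64)

# Pinhole camera intrinsics (focal lengths and principal point) -- assumed.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # tvec is the fixture position in the camera frame; a fixed camera-to-end-
    # effector offset could then be applied to express the pose at the end effector.
    print("rotation (Rodrigues vector):", rvec.ravel())
    print("translation (m):", tvec.ravel())
```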