Computer Vision Lends Precision to Robotic Grappling
information technology and software
Computer Vision Lends Precision to Robotic Grappling (MSC-TOPS-114)
AI analysis of live camera feed yields delta commands to human operator
Overview
Innovators at NASA Johnson Space Center (JSC) have developed computer vision software that quickly derives target pose determinations and then instructs an operator on how to properly align a robotic end-effector with the target they are trying to grapple. As an added benefit, the software's object identification capability can also help detect physical defects on targets.
This technology was originally created to aid robotic arm operators aboard the International Space Station (ISS) who relied heavily upon grappling maneuver instructions derived by flight controllers on the ground at JSC's Mission Control Center (MCC). Despite the aid of computer-based models to predict the alignment of both robotic arm and target, iterative realignment procedures were often required to correct botched grapple operations, costing valuable time.
To solve this problem, NASA's computer vision software analyzes the live camera feed from the robotic arm's single borescope camera and provides the operator with the delta commands required for an ideal grasp operation. This process is aided by a machine learning component that monitors the camera feed for any of the ISS's potential target fixtures. Once a target fixture is identified, proper camera and target parameters are automatically sequenced to prepare for grasping operations.
The Technology
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of a grapple fixture relative to the robotic arm's end effector. Because the borescope camera sensors are typically located several centimeters from their respective end-effector grasping mechanisms, the software must solve a Perspective-n-Point problem: it uses computer vision algorithms to determine alignment solutions between the position of the camera eyepoint and the position of the end effector.
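The offset compensation described above amounts to a frame transform: a pose measured at the camera eyepoint must be re-expressed at the end effector before delta commands make sense. The following is a minimal NumPy sketch of that idea, not NASA's flight code; the function name, frame conventions, and all numbers are assumptions for illustration.

```python
import numpy as np

def target_in_effector_frame(t_cam_target, R_cam_target,
                             R_eff_cam, t_eff_cam):
    """Re-express a target pose measured in the camera frame in the
    end-effector frame, given the fixed camera mounting offset.

    Rotations are 3x3 matrices; translations are 3-vectors in cm.
    (Illustrative sketch only -- frame names are assumptions.)
    """
    t_eff_target = R_eff_cam @ t_cam_target + t_eff_cam
    R_eff_target = R_eff_cam @ R_cam_target
    return t_eff_target, R_eff_target

# Illustrative numbers only: camera mounted 5 cm along the effector
# x-axis, with its axes aligned to the effector frame.
R_eff_cam = np.eye(3)
t_eff_cam = np.array([5.0, 0.0, 0.0])

# Target seen 80 cm straight ahead of the camera.
t_cam_target = np.array([0.0, 0.0, 80.0])
delta, _ = target_in_effector_frame(t_cam_target, np.eye(3),
                                    R_eff_cam, t_eff_cam)
print(delta)  # [X, Y, Z] delta command in cm
```

In a real system the camera-to-effector transform would come from calibration, and the target pose from the Perspective-n-Point solver; the composition step shown here is the same either way.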
The software includes a machine learning component that uses a trained regional Convolutional Neural Network (r-CNN) to analyze a live camera feed and determine which ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS's main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations.
Industrial automation and robotics applications that rely on computer vision solutions may find value in this software's capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit.
This computer vision software is at technology readiness level (TRL) 6 (system/subsystem model or prototype demonstration in an operational environment) and is now available to license. Please note that NASA does not manufacture products itself for commercial sale.

![Shown: A target is acquired by the vision system. The parameters [X, Y, Z] (cm) are calculated and delta commands are fed to the human operator to move the robotic arm end-effector to grapple a moving target.](https://technology.nasa.gov/t2media/tops/img/MSC-TOPS-114/bck.jpg)
Benefits
- Machine-guided efficiency: high-fidelity position data allows human flight controllers to eliminate errors and save mission time.
- Trainable object recognition: a neural network capable of learning new targets automatically identifies objects as they enter a camera's field of view.
- Real-time tracking: recognized objects can be tracked, and positional data updated from frame to frame.
- Dynamic feedback: flight controllers are shown solution quality of target object position calculations.
Applications
- Orbital service: grasping operations for deployment or capture of spacecraft, satellites, and debris; robotic in-situ assembly
- Hazardous environments: robotic vision systems for mining, power generation, marine, reconnaissance and military
- Vehicle docking: mechanized capture, transport, and storage solutions
- Industrial automation: robotic assembly and manufacturing, inspection, automated local area transport
- Telesurgery: robotic vision systems to assist surgeons remotely in the operative field
Technology Details
information technology and software
MSC-TOPS-114
MSC-27184-1
Lucier et al. International Space Station (ISS) Robotics Development Operations Team Results in Robotic Remote Sensing, Control, and Semi-Automated Ground Control Techniques. 16th International Conference on Space Operations, Cape Town, South Africa. May 3-5, 2021.
Similar Results
ARC ANGEL Reduces Gravity’s Effect on Arms
ARC ANGEL is an active robotic system like ARGOS; however, its electric motor is not mounted overhead to a runway and bridge system, but instead is mounted to the test subject's backpack-like PLSS, where the motor(s) supply real-time actuation torque that off-loads the upper arms via cabling. If a test subject picks up a hammer, the system reacts immediately to offload the weight of the hammer relative to the programmed environment.
The ARC ANGEL system comprises an electric motor (or motors), soft goods, electronics hardware, firmware, and software. To provide a smoothly operating arm-offloading analog and optimize system performance, engineers at JSC coded software that leverages kinematic algorithms and a closed-loop architecture for motor control, along with custom computer language scripts to ingest sensor data. This allows ARC ANGEL's subsystems to be seamlessly integrated and to accurately simulate one-g to zero-g environments.
During operation, compact tension sensors and inertial measurement units detect arm weight and motion, feeding a closed-loop control system that passes data to a single-board computer and requisite firmware for processing. A custom graphical user interface was also developed in-house to provide controls for inputting desired arm offload values. Additionally, ARC ANGEL features its own power supply, which powers its subcomponents without external cables. This allows the system to function independently from ARGOS and further lends itself to potential terrestrial applications.
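The closed-loop offloading described above can be illustrated with a simple proportional controller on cable tension: the motor drives measured tension toward a setpoint equal to the desired fraction of the arm's weight. This is a hypothetical sketch, not ARC ANGEL's actual control law; the gain, setpoint logic, and numbers are assumptions.

```python
def offload_command(measured_tension_n, arm_weight_n, offload_fraction,
                    gain=0.5):
    """One step of a proportional tension controller (illustrative only).

    The motor should hold cable tension at offload_fraction of the
    arm's measured weight; the returned command nudges tension toward
    that setpoint. Gain and units are assumptions for the sketch.
    """
    setpoint = offload_fraction * arm_weight_n
    error = setpoint - measured_tension_n
    return gain * error  # delta tension command, newtons

# Simulate converging on a 5/6 offload (lunar-gravity analog) of a
# 60 N arm: the controller settles at the 50 N setpoint.
tension = 0.0
for _ in range(50):
    tension += offload_command(tension, 60.0, 5.0 / 6.0)
print(round(tension, 2))  # 50.0
```

A flight-like implementation would fuse the IMU motion data as well (so fast arm movements are not fought by the controller), but the steady-state tension target works the same way.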
This work directly correlates to active exosuit development that is being implemented for rehabilitation and/or assistive medical devices. ARC ANGEL is essentially providing a desired assistance (offload) while maintaining a subject’s full range of motion. The system hardware and software can be modified to custom-fit an individual without a spacesuit and toward limb-assisted movement – not just arm offloading. ARC ANGEL may already meet a higher physical demand and robustness given that it is engineered to perform in challenging environments with greater loads.
ARC ANGEL is at a technology readiness level (TRL) 5 (component and/or breadboard validation in laboratory environment) and is now available for patent licensing. Please note that NASA does not manufacture products itself for commercial sale.

FlashPose: Range and intensity image-based terrain and vehicle relative pose estimation algorithm
Flashpose combines software written in C with FPGA firmware written in VHDL. It is designed to run under the Linux OS environment in an embedded system or within a custom development application on a Linux workstation. The algorithm is based on the classic Iterative Closest Point (ICP) algorithm originally proposed by Besl and McKay. The algorithm takes in a range image from a three-dimensional imager, filters and thresholds the image, and converts it to a point cloud in the Cartesian coordinate system. It then minimizes the distances between the point cloud and a model of the target at the origin of the Cartesian frame by manipulating point cloud rotation and translation. This procedure is repeated a number of times for a single image until a predefined mean square error metric is met; at this point the process repeats for a new image.
The rotation and translation operations performed on the point cloud represent an estimate of relative attitude and position, otherwise known as pose.
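The ICP loop described above can be sketched in a few dozen lines of NumPy. This is a minimal point-to-point variant with brute-force nearest-neighbor matching and an SVD (Kabsch) transform solve, not the Flashpose flight implementation; all names, tolerances, and the demo geometry are illustrative.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """SVD-based least-squares rotation and translation mapping src to
    dst -- the closed-form solve used in classic point-to-point ICP."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def icp(cloud, model, max_iters=50, mse_tol=1e-8):
    """Align a sensed point cloud to a target model: pair each cloud
    point with its nearest model point, solve the rigid transform,
    apply it, and stop once the mean square error metric is met."""
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(max_iters):
        # Brute-force nearest neighbours (fine for small clouds).
        d2 = ((cloud[:, None, :] - model[None, :, :]) ** 2).sum(-1)
        matched = model[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cloud, matched)
        cloud = cloud @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        if ((cloud - matched) ** 2).sum(-1).mean() < mse_tol:
            break
    return R_total, t_total  # estimated relative attitude and position

# Demo: recover a known offset between a model cube and a sensed cloud.
model = np.array([[i, j, k] for i in (0.0, 1.0)
                  for j in (0.0, 1.0) for k in (0.0, 1.0)])
sensed = model + np.array([0.3, -0.2, 0.1])   # translated sensed cloud
R_est, t_est = icp(sensed, model)
print(np.round(t_est, 3))  # recovers [-0.3  0.2 -0.1]
```

The accumulated rotation and translation returned at the end are exactly the pose estimate the paragraph above refers to.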
In addition to 6 degree of freedom (DOF) pose estimation, Flashpose also provides a range and bearing estimate relative to the sensor reference frame. This estimate is based on a simple algorithm that generates a configurable histogram of range information, and analyzes characteristics of the histogram to produce the range and bearing estimate. This can be generated quickly and provides valuable information for seeding the Flashpose ICP algorithm as well as external optical pose algorithms and relative attitude Kalman filters.
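A histogram-based seed estimate of this kind might look like the sketch below, assuming a 2-D range image and a small field of view. The binning scheme, parameter names, and demo scene are assumptions for illustration, not Flashpose's actual configuration.

```python
import numpy as np

def range_and_bearing(range_image, fov_x_deg=20.0, fov_y_deg=20.0,
                      bins=64):
    """Seed estimate from a range image: histogram the valid ranges,
    take the most-populated bin as the target range, and take the
    centroid of that bin's pixels as the bearing. (Illustrative only.)"""
    valid = np.isfinite(range_image) & (range_image > 0)
    ranges = range_image[valid]
    counts, edges = np.histogram(ranges, bins=bins)
    peak = counts.argmax()
    range_est = 0.5 * (edges[peak] + edges[peak + 1])

    # Pixels whose range falls in the peak bin define the direction.
    rr, cc = np.nonzero(valid)           # same ordering as `ranges`
    in_peak = (ranges >= edges[peak]) & (ranges <= edges[peak + 1])
    rows, cols = rr[in_peak], cc[in_peak]
    h, w = range_image.shape
    az = (cols.mean() / (w - 1) - 0.5) * fov_x_deg  # deg off boresight
    el = (rows.mean() / (h - 1) - 0.5) * fov_y_deg
    return range_est, az, el

# Demo: a flat target patch at ~10 m in the upper-left of the frame.
img = np.full((64, 64), np.nan)
img[8:24, 8:24] = 10.0
r, az, el = range_and_bearing(img)
```

Because the dominant histogram bin isolates the target's surface from background and dropout pixels, this cheap estimate is good enough to seed the full ICP solve or an external filter, as the paragraph above notes.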

Visual Inspection Posable Invertebrate Robot (VIPIR)
Initially developed as a close-quarters inspection tool capable of accessing hard-to-reach, tight, or visibly restricted areas of satellites, the VIPIR system can be used to remotely inspect inaccessible locations such as behind a sheet of thermal blanketing material, into a satellite's plumbing, or perhaps even deep inside the otherwise unreachable crevices of a spacecraft bus. The VIPIR system incorporates a number of subassemblies that provide exceptional operational freedom and capabilities for imaging and inspection of componentry.
VIPIR’s Video Borescope Assembly (VBA) is a flexible snake-camera capable of multidirectional articulations, making steering and control simple and intuitive for an operator. The VBA also includes at least one imaging sensor and lighting to see in dark, confined spaces. Real-time, high-resolution visual information can be fed back to an operator for live analysis. A reel system extends and retracts the VBA with the use of a spool, and includes position indicators for deployment tracking. The Tendon Management System (TMS), not unlike human tendons, utilizes pulleys and tensioners to articulate the VBA in the confined spaces. A seal system ensures the VBA is free of contamination.
VIPIR underwent space-based testing on the ISS during the Robotic Refueling Phase 2 (RRM-2) mission designed to showcase and test several NASA advanced robotic satellite servicing technologies. During this mission, VIPIR demonstrated state-of-the-art near and midrange inspection capabilities. NASA’s VIPIR system is available for licensing to industry, and may be desirable to companies focused on satellite servicing, on-orbit assembly, and other applications requiring detailed inspection of assets in space.

Circumferential Scissor Spring Enhances Precision in Hand Controllers
The traditional scissor spring design for hand controllers has been improved upon with a circumferential spring controller mechanism that facilitates easy customization, enhanced durability, and optimum controller feedback. These advantages are partially facilitated by locating the spring on the outside of the mechanism, which allows for easier spring replacement to adjust the deflection force or for maintenance.
The new mechanism comprises two rounded blades, or cams, that pivot forward and back under operation and meet to form a circle. An expansion spring is looped around the blade perimeter and resides in a channel, providing the restoring force that returns the control stick to a neutral position. Because the circumferential spring is longer, the proportion of spring expansion is smaller for a given distance of deflection, so the forces associated with the deflection remain on a more linear portion of the force-deflection curve.
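The linearity argument reduces to simple arithmetic: the same absolute extension is a much smaller fractional strain of a longer spring, keeping the operating point on the near-linear part of the force-deflection curve. A quick sketch with purely illustrative numbers (the real spring lengths are not stated in this listing):

```python
def fractional_extension(free_length_cm, extension_cm):
    """Fractional stretch of a spring for a given absolute extension;
    keeping this small keeps the restoring force near-linear."""
    return extension_cm / free_length_cm

# Illustrative numbers only: the same 0.4 cm extension from a stick
# deflection strains a hypothetical 2 cm scissor spring far more than
# a 12 cm spring looped around the blade perimeter.
short = fractional_extension(2.0, 0.4)   # 0.20 -> 20% strain
long_ = fractional_extension(12.0, 0.4)  # ~0.033 -> 3.3% strain
print(short, round(long_, 3))
```

At 3% strain the force-extension relationship of a typical extension spring is still effectively linear, whereas at 20% it may not be, which is the benefit the paragraph above describes.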
The Circumferential Scissor Spring for Controllers is at technology readiness level (TRL) 8 (actual system completed and flight qualified through test and demonstration) and is available for patent licensing. Please note that NASA does not manufacture products itself for commercial sale.

Spacecraft to Remove Orbital Debris
An approach to mitigating the creation of additional orbital debris is to remove the sources of future medium debris by actively removing large spent objects from congested orbits. NASA has introduced the ADRV, an efficient and effective solution to remove large debris from LEO such as spent rocket bodies and non-functional satellites. The concept yields a single-use, low-cost, lightweight, high mass fraction vehicle that enables the specific removal of large orbital debris (1000 - 4000 kg mass, 200 - 2000 km altitude, and 20 - 98 degree inclination). The ADRV performs rendezvous, approach, and capture of non-cooperative tumbling debris objects, maneuvering of the mated vehicle, and controlled, targeted reposition or deorbit of the mated vehicle. Due to its small form factor, up to eight ADRVs can be launched in a single payload, enabling high-impact orbital debris removal missions within the same inclination group.
Three key technologies were developed to enable the ADRV:
- The spacecraft control system (SCS) is a guidance, navigation, and control system that provides vehicle control during all phases of a mission;
- The debris object characterization system (DOCS) characterizes movement and capture of non-cooperative targets; and
- The capture and release system (CARS) allows the vehicle to capture and mate with orbital debris targets.
These technologies can significantly improve the current state-of-the-art capabilities of automated rendezvous and docking technology for debris objects with tumbling rates up to 25 degrees per second. This approach leverages decades of spaceflight experience while automating key mission areas to reduce cost and improve the likelihood of success.