Vision-based Approach and Landing System (VALS)

Aerospace
Vision-based Approach and Landing System (VALS) (TOP2-322)
Alternative Position, Navigation, and Timing (APNT) solution for Advanced Air Mobility aircraft in environments where GPS is not available
Overview
There are situations in which an Advanced Air Mobility (AAM) aircraft must land without assistance from GPS data. For example, tall buildings and narrow canyons in an urban environment can degrade an AAM aircraft's ability to use GPS effectively when accessing a landing area. To address this, NASA Ames has developed a novel vision-based Alternative Position, Navigation, and Timing (APNT) solution for AAM aircraft operating in environments where GPS is not available. The technology employs an aircraft-mounted camera to visually detect known features in the landing area, a computer vision method called Coplanar Pose from Orthography and Scaling with Iterations (COPOSIT) to determine the position and orientation of the aircraft relative to the features detected in the camera images, and an extended Kalman filter to predict and correct the estimated position, velocity, and orientation of the aircraft.

The Technology
The novel Vision-based Approach and Landing System (VALS) provides Advanced Air Mobility (AAM) aircraft with an Alternative Position, Navigation, and Timing (APNT) solution for approach and landing without relying on GPS. VALS operates on multiple images obtained by the aircraft's video camera as the aircraft performs its descent. In this system, feature detection techniques such as Hough circle and Harris corner detection are used to identify which portions of the image may contain landmark features. These image areas are compared with a stored list of known landmarks to determine which detected features correspond to known landmarks. The world coordinates of the best-matched image landmarks are input into a Coplanar Pose from Orthography and Scaling with Iterations (COPOSIT) module to estimate the camera position relative to the landmark points, which yields an estimate of the position and orientation of the aircraft. The estimated aircraft position and orientation are fed into an extended Kalman filter to further refine the estimates of aircraft position, velocity, and orientation. Thus, the aircraft's position, velocity, and orientation are determined without the use of GPS data or signals. Future work includes feeding the vision-based navigation data into the aircraft's flight control system to facilitate aircraft landing.
Top: Vision-based Approach and Landing System (VALS) block diagram
Bottom: An overview of an environment in which an AAM aircraft might operate
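The estimation loop described above (a vision-derived fix in, a refined state out) can be sketched in miniature. The code below is an illustrative stand-in, not NASA's implementation: a 1-D linear Kalman filter fuses noisy vision-derived altitude fixes into smoothed altitude and sink-rate estimates, whereas the actual VALS runs an extended Kalman filter over full 3-D position, velocity, and orientation with COPOSIT supplying the fixes. All numbers are assumed for illustration.

```python
def kf_step(state, P, z, dt=1.0, q=0.01, r=0.25):
    """One predict/correct cycle. state = (position, velocity),
    P: 2x2 covariance, z: vision-derived position measurement."""
    x, v = state
    # Predict with a constant-velocity motion model
    x = x + v * dt
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # Correct: blend the prediction with the vision fix
    s = p00 + r                    # innovation covariance
    k0, k1 = p00 / s, p10 / s      # Kalman gain
    y = z - x                      # innovation (measurement residual)
    x, v = x + k0 * y, v + k1 * y
    P = [[(1 - k0) * p00, (1 - k0) * p01],
         [p10 - k1 * p00, p11 - k1 * p01]]
    return (x, v), P

# Simulated descent: noisy altitude fixes dropping about 1 m per second
fixes = [99.1, 97.9, 97.2, 95.8, 95.1, 93.9]
state, P = (100.0, 0.0), [[1.0, 0.0], [0.0, 1.0]]
for z in fixes:
    state, P = kf_step(state, P, z)
print(state)  # filtered (altitude, sink rate)
```

The filter converges on both the altitude and the roughly 1 m/s sink rate even though each individual fix is noisy, which is the role the extended Kalman filter plays downstream of COPOSIT in the system described above.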
Benefits
  • Precision landing solution in GPS-denied or complex environments: replacement or backup to GPS-based or ground-based visual navigation in automated approach and landing systems using onboard vision and sensors
  • Enhanced safety: provides AAM aircraft with an Alternative Position, Navigation, and Timing (APNT) solution for approach and landing without relying on GPS
  • Precise navigation without GPS: position, velocity, and orientation of the aircraft are determined without access to GPS, unlike conventional GPS-dependent systems
  • Near real-time operation: applies the computer vision method Coplanar Pose from Orthography and Scaling with Iterations (COPOSIT) with feature detection methods
  • Sensor fusion: combining an inertial measurement unit (IMU) with vision creates a fused navigation solution for GPS-denied environments

Applications
  • Advanced Air Mobility (AAM)
  • Urban Air Mobility (UAM) aircraft (e.g., drones and other unmanned aerial vehicles (UAVs), and electric vertical takeoff and landing (eVTOL) aircraft)
  • Air-Taxi industry
  • Commercial aircraft with downward-facing cameras may also apply this approach, incorporating a landing system based on the landing lights or fiducials on the runway
  • Airports, heliports, and vertiports with landmarks or landing lights or fiducials can leverage the technology to assist incoming aircraft during approach and landing
Technology Details

ARC-18891-1
Related publications:
  • Simulated Vision-based Approach and Landing System for Advanced Air Mobility: https://arc.aiaa.org/doi/abs/10.2514/6.2023-2195 | https://ntrs.nasa.gov/api/citations/20220017713/downloads/main%202023%20AIAA%20DS%20r10ek.pdf | https://ntrs.nasa.gov/api/citations/20220018454/downloads/2023%20AIAA%20Scitech%20VALS%20Presentation%20r5ek.pdf
  • VSLAM and Vision-based Approach and Landing for Advanced Air Mobility: https://arc.aiaa.org/doi/abs/10.2514/6.2023-2196 | https://ntrs.nasa.gov/api/citations/20220017714/downloads/main%20VSLAM%20r12ek.pdf | https://ntrs.nasa.gov/api/citations/20220018455/downloads/2023%20AIAA%20Scitech%20VSLAM%20VAL%20Presentation%20r5ek.pdf
  • Vision-Based Precision Approach and Landing for Advanced Air Mobility: https://arc.aiaa.org/doi/abs/10.2514/6.2022-0497 | https://ntrs.nasa.gov/api/citations/20210025114/downloads/main%20r13_compressed.pdf | https://ntrs.nasa.gov/api/citations/20210025280/downloads/2022%20AIAA%20Scitech%20Presentation%20r3ek.pdf
Similar Results
Airborne Machine Learning Estimates for Local Winds and Kinematics
The MAchine learning ESTimations for uRban Operations (MAESTRO) system is a novel approach that couples commodity sensors with advanced algorithms to provide real-time onboard local wind and kinematics estimations to a vehicle's guidance and navigation system. Sensors and computations are integrated in a novel way to predict local winds and promote safe operations in dynamic urban regions where Global Positioning System/Global Navigation Satellite System (GPS/GNSS) and other network communications may be unavailable or difficult to obtain when surrounded by tall buildings, due to multi-path reflections and signal diffusion. The system can be implemented onboard an Unmanned Aerial System (UAS), and once airborne the system does not require communication with an external data source or with GPS/GNSS. Estimations of the local winds (speed and direction) are created using inputs from onboard sensors that scan the local building environment. This information can then be used by the onboard guidance and navigation system to determine safe and energy-efficient trajectories for operations in urban and suburban settings. The technology is robust to dynamic environments, input noise, missing data, and other uncertainties, and has been demonstrated successfully in lab experiments and computer simulations.
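MAESTRO's machine-learning estimator itself is not detailed in the summary above, but the classical kinematic relation that any onboard wind estimate builds on is the "wind triangle": the wind vector equals ground velocity minus air velocity. The sketch below computes wind speed and the direction the wind blows from, given airspeed/heading and groundspeed/track; the function name and all numbers are illustrative assumptions, not part of MAESTRO.

```python
import math

def wind_triangle(gs, track_deg, tas, heading_deg):
    """Return (wind speed, direction wind is FROM in degrees true).
    gs/tas in m/s; track/heading in degrees clockwise from north."""
    to_xy = lambda speed, ang: (speed * math.sin(math.radians(ang)),
                                speed * math.cos(math.radians(ang)))
    gx, gy = to_xy(gs, track_deg)      # ground velocity (east, north)
    ax, ay = to_xy(tas, heading_deg)   # air velocity (east, north)
    wx, wy = gx - ax, gy - ay          # wind vector (direction blowing toward)
    speed = math.hypot(wx, wy)
    from_deg = (math.degrees(math.atan2(wx, wy)) + 180.0) % 360.0
    return speed, from_deg

# Heading due north at 20 m/s airspeed, but drifting east over the ground:
speed, from_deg = wind_triangle(20.6, 14.0, 20.0, 0.0)
print(round(speed, 1), round(from_deg))  # → 5.0 270  (5 m/s from the west)
```

The eastward drift between the air vector and the ground vector resolves to a westerly wind; a system like MAESTRO refines this kind of estimate using its building-environment sensing when GPS/GNSS-derived ground velocity is unavailable.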
Powerline Geolocation
The electrical transmission lines used to transmit power are optimized for 50 to 60 Hz waveforms but are suitable for waveforms of higher frequencies, into the megahertz range. Powerline conductors are therefore capable of carrying signals that could be used for geolocation. Indeed, frequencies in the 100 kHz range are already used for diagnostic purposes in contemporary power grids: signals transmitted in this band verify the operation of sites along the power grid and configure the grid (for example, throwing a circuit breaker). The technology takes advantage of the suitability of electrical conductors employed in power transmission for signal transmission in the sub-megahertz range and, in the preferred embodiment, utilizes the existing diagnostic signals in this range for geolocation.
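The summary does not disclose the patented geolocation algorithm, but one standard way a signal on a conductor yields a position is time-difference-of-arrival (TDOA) between two monitoring sites at known locations along the line. The sketch below is purely illustrative: the two-site setup, the assumed propagation speed, and the names are assumptions, not the preferred embodiment.

```python
V = 2.0e8  # assumed propagation speed in the conductor, m/s (~0.67c)

def position_on_line(t_a, t_b, length):
    """Distance from site A of a signal source on a conductor of the
    given length (m), from arrival times t_a, t_b (s) at sites A and B.
    Derived from x / V = t_a - t0 and (length - x) / V = t_b - t0."""
    return (length + V * (t_a - t_b)) / 2.0

# Source 3 km from site A on a 10 km segment:
x_true = 3000.0
t_a, t_b = x_true / V, (10000.0 - x_true) / V
print(position_on_line(t_a, t_b, 10000.0))  # → 3000.0
```

Subtracting the two arrival times cancels the unknown emission time t0, which is why two sites suffice for a one-dimensional position along the conductor.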
Low Weight Flight Controller Design
Increasing demand for smaller UAVs (e.g., sometimes with wingspans on the order of six inches and weighing less than one pound) generated a need for much smaller flight and sensing equipment. NASA Langley's new sensing and flight control system for small UAVs includes both an active flight control board and an avionics sensor board. Together, these compare the status of the UAV's position, heading, and orientation with the pre-programmed data to determine and apply the flight control inputs needed to maintain the desired course. To satisfy the small form-factor system requirements, micro-electro-mechanical systems (MEMS) are used to realize the various flight control sensing devices. MEMS-based devices are commercially available single-chip devices that lend themselves to easy integration onto a circuit board. The system uses less energy than current systems, allowing solar panels mounted on the vehicle to generate the system's power. While the lightweight technology was designed for smaller UAVs, the sensors could be distributed throughout larger UAVs, depending on the application.
Unmanned Aerial Systems (UAS) Traffic Management
NASA Ames has developed the Autonomous Situational Awareness Platform for UAS (ASAP-U), a traffic management system that safely integrates Unmanned Aerial Systems (UASs) into the National Airspace System by combining existing navigation technology (both aviation and maritime) with new procedures. The ASAP-U module includes a transmitter, receivers, and various links to other UAS systems. The module collects Global Positioning System (GPS) coordinates and time from a satellite antenna, and this data is fed to the UAS's flight management system for navigation. The module autonomously and continuously broadcasts UAS information via a radio frequency (RF) antenna using Self-Organized Time Division Multiple Access (SOTDMA) to prevent signal overlap, and it also receives ASAP data from other aircraft. In case of transmission overload, priority is given to closer aircraft. Additionally, the module can receive weather data, navigational aid data, terrain data, and updates to the UAS flight plan.

The collected data is relayed to the flight management system, which includes various databases and a navigation computer to calculate necessary flight plan modifications based on regulations, right-of-way rules, terrain, and geofencing. Conflicts are checked against the databases; if none are found, the flight plan is implemented, and if conflicts arise, modifications can be made. Because the ASAP-U module continuously receives and transmits data, including UAS data and data from other aircraft, it can detect conflicts with other aircraft, terrain, weather, and geofencing throughout the flight. Based on this information, the flight management system determines the need for course adjustments, and the flight control system executes them to maintain a safe flight route.
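The slot-sharing idea behind SOTDMA, as summarized above, can be sketched in a few lines: each transmitter claims an unused slot in the frame, and when the frame is saturated, the slot held by the most distant aircraft is reused first so that closer (higher-priority) traffic is preempted last. The frame length and the exact preemption rule below are illustrative assumptions, not the ASAP-U specification.

```python
import random

FRAME_SLOTS = 8  # toy frame; real SOTDMA frames are far longer

def choose_slot(frame, distances):
    """frame: {slot: aircraft_id} for occupied slots;
    distances: {aircraft_id: km from own aircraft}.
    Returns the slot this transmitter should use."""
    free = [s for s in range(FRAME_SLOTS) if s not in frame]
    if free:
        return random.choice(free)  # self-organize into any open slot
    # Saturated frame: reuse the farthest aircraft's slot, so that
    # nearby traffic keeps its transmission priority
    return max(frame, key=lambda s: distances[frame[s]])

# Saturated frame where slot 5 is held by the farthest aircraft (42 km):
frame = {s: f"AC{s}" for s in range(FRAME_SLOTS)}
distances = {f"AC{s}": 10.0 for s in range(FRAME_SLOTS)}
distances["AC5"] = 42.0
print(choose_slot(frame, distances))  # → 5
```

Distance-based preemption mirrors the overload behavior described above: when every slot is taken, the link budget is spent on the aircraft that matter most for near-term conflict detection.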
Improved Ground Collision Avoidance System
This critical safety tool can be used for a wider variety of aircraft, including general aviation aircraft (GAA), helicopters, and unmanned aerial vehicles (UAVs), while also improving performance in the fighter aircraft currently using this type of system.
Demonstrations/Testing
This improved approach to ground collision avoidance has been demonstrated on both small UAVs and a Cirrus SR22 while running the technology on a mobile device. These tests were performed to prove the feasibility of the app-based implementation of this technology. The testing also characterized the flight dynamics of the avoidance maneuvers for each platform, evaluated collision avoidance protection, and analyzed nuisance potential (i.e., the tendency to issue false warnings when the pilot does not consider ground impact to be imminent).
Armstrong's Work Toward an Automated Collision Avoidance System
Controlled flight into terrain (CFIT) remains a leading cause of fatalities in aviation, resulting in roughly 100 deaths each year in the United States alone. Although warning systems have virtually eliminated CFIT for large commercial air carriers, the problem remains for fighter aircraft, helicopters, and GAA. Innovations developed at NASA's Armstrong Flight Research Center are laying the foundation for a collision avoidance system that would automatically take control of an aircraft that is in danger of crashing into the ground and fly it, and the people inside, to safety. The technology relies on a navigation system to position the aircraft over a digital terrain elevation database, algorithms to determine the potential and imminence of a collision, and an autopilot to avoid the potential collision. The system is designed not only to provide nuisance-free warnings to the pilot but also to take over when a pilot is disoriented or unable to control the aircraft.
The payoff from implementing the system, designed to operate with minimal modifications on a variety of aircraft, including military jets, UAVs, and GAA, could be billions of dollars and hundreds of lives and aircraft saved. Furthermore, the technology has the potential to be applied beyond aviation and could be adapted for use in any vehicle that has to avoid a collision threat, including aerospace satellites, automobiles, scientific research vehicles, and marine charting systems.
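The core check such a system performs, positioning the aircraft over a digital terrain elevation database and judging the imminence of a collision, can be sketched simply: project the track forward over the terrain profile and trigger an avoidance maneuver if predicted clearance drops below a floor within the look-ahead window. The sample terrain, thresholds, and flat-trajectory model below are assumptions for illustration, not Armstrong's algorithms.

```python
def collision_imminent(alt_m, vspeed_ms, gspeed_ms, terrain_m,
                       post_spacing_m=100.0, floor_m=150.0):
    """terrain_m: elevations (m) ahead along track, one per post_spacing_m.
    Returns True if predicted clearance violates the floor in the window."""
    for i, elev in enumerate(terrain_m):
        t = (i + 1) * post_spacing_m / gspeed_ms  # time to reach this post
        predicted_alt = alt_m + vspeed_ms * t     # simple linear prediction
        if predicted_alt - elev < floor_m:
            return True                           # clearance violated: avoid
    return False

# Level flight at 1,000 m toward terrain rising to 900 m:
terrain = [200, 400, 600, 800, 900]
print(collision_imminent(1000.0, 0.0, 60.0, terrain))  # → True
```

A fielded system layers nuisance suppression on top of a check like this, delaying the warning or takeover until the predicted avoidance maneuver is genuinely the last safe option.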