A Flight Sim Enthusiast's Notebook


Airbus's ATTOL Autonomous Taxi, Take-Off and Landing Project

There was a lot of negative news during the pandemic, and it was impossible to go out for shooting, so the blog naturally didn’t get updated. However, there is also positive news, such as Airbus’s ATTOL autonomous taxi, take-off, and landing project: Airbus concludes ATTOL with fully autonomous flight tests.

Chinese media has also covered this extensively; here are a few excerpts.

China Aviation News: Airbus completes autonomous taxi, take-off and landing project flight tests

On June 30, 2020, following an extensive flight test program spanning two years, Airbus successfully completed its Autonomous Taxi, Take-Off and Landing (ATTOL) project.

With the project's completion, Airbus became the first in the global aviation sector to demonstrate autonomous taxiing, takeoff, and landing of a civil aircraft using onboard image recognition technology in vision-based, fully automatic flight tests.

The project completed over 500 test flights, of which approximately 450 were dedicated to collecting raw video data to support and fine-tune the algorithms, while a further 6 series of test flights (each consisting of 5 takeoffs and landings) were used to test the autonomous flight capabilities.

The ATTOL project was initiated by Airbus to explore autonomous technologies, including the application of machine learning algorithms and automated tools to data labeling, processing, and modeling, helping pilots spend less attention on aircraft operation and more on strategic decision-making and mission management. Airbus can now analyze the potential of these technologies to enhance future aircraft operations, further improving aircraft safety while maintaining today's unprecedented safety levels.

International Aviation (WeChat Official Account), 2020-07-06: "Seeing" aircraft: Airbus develops a machine vision-based automatic takeoff and landing system

ATTOL combines GPS with image recognition technology to determine the aircraft's relative position and guide it. The core of the ATTOL system consists of cameras, image-processing algorithms, and aircraft control laws. The image recognition system detects the various markings on the runway and from them infers the location of the runway's centerline.
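The geometric step described here can be sketched very simply. This is not Airbus's code, only an illustration of the idea: once the left and right runway edge markings have been located in the image (the hard part, done by the vision algorithms), the centerline lies midway between them, and a boresighted camera's image center stands in for the aircraft's own axis. All coordinates and names below are hypothetical.

```python
# Illustrative sketch (not Airbus code): infer the runway centerline
# from detected edge markings and estimate lateral deviation.
# Inputs are hypothetical image-plane x positions (pixels) of the
# left and right runway edge lines on one scanline.

def centerline_x(left_edge_x: float, right_edge_x: float) -> float:
    """The centerline lies midway between the two edge markings."""
    return (left_edge_x + right_edge_x) / 2.0

def lateral_deviation(image_width: int,
                      left_edge_x: float,
                      right_edge_x: float) -> float:
    """Positive result: aircraft is right of the centerline.

    Assumes the camera is boresighted, so the image's horizontal
    center corresponds to the aircraft's longitudinal axis.
    """
    camera_x = image_width / 2.0
    return camera_x - centerline_x(left_edge_x, right_edge_x)

# Example: 1280-px-wide frame, edge lines detected at x=400 and x=900;
# the centerline is at x=650, the camera axis at x=640, so the
# aircraft sits slightly left of the centerline.
dev = lateral_deviation(1280, 400, 900)  # -10.0
```

A real system would of course work on full line segments from an edge detector and convert the pixel offset into a physical steering command through the aircraft control laws, but the midpoint geometry is the same.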

The ATTOL system's software improves its reliability through machine learning techniques, but it has not yet been certified. Experts indicate that in future cockpits this new technology and traditional navigation technology will run in parallel, and the outputs of the two must be correlated. At the current stage the ATTOL project still faces many challenges, such as how machine vision systems can operate effectively in low visibility, whether airports will need to paint edge lines and markings more clearly, and how to handle recognition errors.
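The "outputs must be correlated" requirement can be made concrete with a minimal cross-check sketch: compare the vision-based position estimate with the traditional-navigation estimate and fall back to the traditional source when they disagree. The function names and the 10 m threshold are illustrative assumptions, not Airbus parameters.

```python
# Hedged sketch of cross-checking a vision-based position against a
# traditional navigation (e.g. GPS) position. Threshold is assumed.
import math

def positions_agree(vision_xy, nav_xy, max_divergence_m=10.0):
    """True if the two position estimates lie within the threshold."""
    dx = vision_xy[0] - nav_xy[0]
    dy = vision_xy[1] - nav_xy[1]
    return math.hypot(dx, dy) <= max_divergence_m

def fused_position(vision_xy, nav_xy):
    """Use the vision estimate only while it correlates with the
    traditional source; otherwise fall back to traditional navigation."""
    return vision_xy if positions_agree(vision_xy, nav_xy) else nav_xy
```

This is the simplest possible monitor; a certified system would track disagreement over time and raise crew alerts rather than silently switching sources, but it shows why running both technologies in parallel makes the new one's errors detectable.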

Aviation Industry Information Network: Airbus completes ATTOL project autonomous flight tests

The ATTOL project aims to explore how autonomous technologies (including machine learning algorithms and automated tools for data labeling, processing, and model generation) can help pilots focus more on strategic decision-making and mission management during flight, rather than on operating the aircraft. The project's goal is to improve the operational safety of existing passenger aircraft, but the technology can also be applied to new-generation electric vertical take-off and landing (eVTOL) urban air mobility aircraft.

ATTOL project manager Sebastien Giuliano emphasized that many aircraft are already capable of automatic landing, but they all rely on external infrastructure such as Instrument Landing Systems (ILS) or GPS signals. ATTOL's goal is to achieve the same using only onboard technology, to maximize efficiency and reduce infrastructure costs.

Acubed's Wayfinder team developed software based on computer vision and machine learning that enables an aircraft to perceive its surroundings and compute how best to navigate within them. This is achieved by combining sensors (cameras, radar, and laser-based LiDAR) with powerful onboard computers. Wayfinder project manager Arne Stoschek believes the key challenge for autonomous functions is how the system handles unexpected events; that is the huge leap from automation to autonomy.