Truly enhanced flight vision, better displays dominate OEMs' imaginations
15 April 2020

Thanks to today’s enhanced flight vision systems (EFVS) and synthetic vision systems (SVS), pilots can fly safely in low-light and obscured-visibility conditions that would otherwise be life-threatening. But OEMs such as Collins, Garmin, Gulfstream, and Universal Avionics are not satisfied with the state of the art. They want tomorrow’s EFVS/SVS to provide pilots with even better visibility, more vital real-time flight information, and a reduced cockpit workload. Here’s what they are doing to achieve these goals.

Augmented reality, voice recognition and room for a drink

Garmin has embraced the power of SVS with its G3000 Integrated Flight Deck, which delivers EFVS data to the company’s GHD-2100 HUD. Currently undergoing initial certification for the Cessna Citation Longitude business jet, the GHD-2100 consists of a high-resolution glass combiner HUD display (mounted between the pilot and the window) and a slim, self-contained projection unit designed to give pilots more head room.

“Our goal is to do everything the rest of the industry is doing in EFVS/SVS in a more economical and smaller package,” said Brian Ast, Garmin’s senior aviation systems and human factors engineer. “But we are also looking at emerging consumer technologies to see which ones might be suitable in the cockpit.”

With respect to EFVS/SVS in general, Ast can see the usefulness of Augmented Reality glasses/displays — where interactive computer-generated images are overlaid on real-time views — to give pilots extra control in the cockpit. (This includes using sound waves and localized air pressure to create non-physical AR controls that can be ‘touched’ and manipulated by human hands, as in the Tom Cruise film ‘Minority Report’.)

Meanwhile, combining EFVS with voice recognition could reduce the need for pilots to look down at their control surfaces and away from the windows.

“By adding AR and voice recognition in the cockpit, you can boost the performance of the EFVS/SVS without saturating the pilot in a heads-down world,” said Ast. “At the same time, you have to be very selective about bringing new technology into the cockpit. Just because something is becoming commonplace in the consumer world doesn’t mean that it has a place in aviation.”

One nice feature: The extra space allowed by the compact GHD-2100 includes an 11-inch gap between the glass display and the pilot, Ast said, “which is enough space to drink from a bottle of water without hitting the glass.”

Pointing is controlling

Universal Avionics (UA) does not produce EFVS for general aviation aircraft. “Due to the size, weight, and cost of traditional EFVS installations, EFVS has been limited to date to larger business aircraft and commercial airlines,” explained Marc Bouliane, Universal Avionics’ vice president of strategic business development. “But with the advent of our SkyLens Head-Wearable Display and our continuing work in EFVS, we are reducing the cost of acquisition of these systems while making them lighter and easier to install.” So far, UA has succeeded in selling these systems to ATR’s regional aircraft line and Leonardo Helicopters.

Meanwhile, UA has been developing and demonstrating its concept for an Interactive Synthetic Vision System (i-SVS). “With it, a pilot can use the i-SVS’ Line-Of-Sight to point to features around the aircraft and interact with them heads-up,” said Bouliane. “For example, during search and rescue operations, the pilot could select a point in space where people need to be rescued, and couple the Flight Management System (FMS) to this location to perform an optimum approach with transition to hover.”

Like Garmin, UA is interested in pairing voice recognition with its i-SVS to minimize heads-down work in the cockpit, maximizing pilot situational awareness and safety.

“We are also working to advance penetration in Degraded Visual Environments to provide an ‘all weather window’ to the outside world,” Bouliane said.

True-to-life HUD VR training

Creating an affordable, realistic trainer without using a Level D simulator isn’t easy, but Collins Aerospace has done so with its HUD VR trainer. Designed to run on a Windows gaming laptop with the user wearing 3D Oculus Rift VR goggles, the HUD VR Trainer faithfully reproduces the experience of using a Collins Aerospace HGS-6000 Head Up Guidance System in a transport aircraft. The HUD VR Trainer can also serve as a 2D training tool using the computer’s own screen or a large video display for classroom viewing.

“The HUD VR trainer began its life as a sales tool for the HGS-6000, which is why we did our best to make it so realistic — including showing the optical qualities of the actual combiner glass,” said Marc Cronen, principal program manager at Collins Aerospace. “To achieve this goal, the HUD VR Trainer runs on the same code as the HGS-6000 EFVS.”

The HUD VR Trainer is a true instructional tool with many features. The wraparound VR cockpit graphics are ultra-realistic, with the user able to ‘rotate’ the glass combiner screen out of their sightline, just as they could in a real aircraft. The HUD VR Trainer’s operating modes include Standard (simply displaying HGS-6000 symbology as the simulation progresses), Synthetic Vision (the symbology on top of a computer-generated terrain map), Enhanced Vision (the symbology combined with simulated EVS sensor readings tied to the program’s out-of-the-window imagery), and Combined Vision, which combines the three other modes together.
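The four operating modes can be thought of as combinations of display layers stacked onto the HUD view. A minimal sketch of that idea, assuming nothing about Collins’ actual software (the layer names and groupings below are illustrative, not the trainer’s real architecture):

```python
from enum import Flag, auto

class Layer(Flag):
    """Display layers a HUD trainer might composite (illustrative names)."""
    SYMBOLOGY = auto()          # HGS-style flight symbology
    SYNTHETIC_TERRAIN = auto()  # computer-generated terrain map
    EVS_SENSOR = auto()         # simulated IR sensor imagery

# The four modes described above, expressed as layer combinations
MODES = {
    "Standard": Layer.SYMBOLOGY,
    "Synthetic Vision": Layer.SYMBOLOGY | Layer.SYNTHETIC_TERRAIN,
    "Enhanced Vision": Layer.SYMBOLOGY | Layer.EVS_SENSOR,
    "Combined Vision": Layer.SYMBOLOGY | Layer.SYNTHETIC_TERRAIN | Layer.EVS_SENSOR,
}

print(Layer.EVS_SENSOR in MODES["Combined Vision"])  # True
```

Expressing modes as layer flags makes the relationship explicit: Combined Vision is simply every layer enabled at once, which matches how the article describes it.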

The HUD VR Trainer is well-suited to the aviation industry’s demand for HUD-trained pilots, “particularly in China, where the government has mandated that all transport aircraft be equipped with HUDs by 2025,” said Cronen. “This is going to drive a huge demand for HUD training globally, and the HUD VR Trainer can meet this need in a cost-effective manner.”

Confronting the next great challenge

Gulfstream is rightly proud of its role as an EFVS pioneer. “In 2018, we were the first OEM to be certified under the FAA’s EFVS-to-land regulations, where the Gulfstream G500 could be landed by the pilot relying solely on the EFVS’s readouts in poor weather,” said Jeff Hausmann, Gulfstream’s director of advanced flight deck programs. “The same EFVS-to-land capability also exists in our G600 and G700 business aircraft. We consider this capability to be essential in today’s business aviation world, where customers want to fly whenever they choose, no matter how poor the visibility.”

Hausmann credits Gulfstream’s EFVS capability to its use of cooled IR sensors. “It’s just a matter of physics: Cooled IR sensors are more sensitive than the uncooled IR sensors used by other EFVS systems,” he said.

That said, Hausmann sees a challenge looming on the EFVS horizon, due to the push by airports to go from incandescent to long-lasting LED runway lighting. “Airports are changing their lights not to save on power costs, but rather the human resources costs associated with changing burnt-out incandescent lights,” he told Avionics International.

Whatever the reason, an EFVS’s IR sensors rely on the heat emitted by incandescent runway lights to ‘see’ the runway. When those bulbs are replaced with LEDs, which run close to ambient temperature, the lack of infrared signature can blind the EFVS.
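The scale of the problem can be illustrated with Planck’s law: a tungsten filament at roughly 2800 K radiates strongly in the mid-wave IR band where many EFVS sensors operate, while an LED fixture sits near ambient temperature. A rough back-of-the-envelope comparison (the temperatures and the 4 µm sample wavelength are illustrative assumptions, not sensor specifications):

```python
import math

H = 6.626e-34   # Planck constant, J·s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance (Planck's law), W·sr^-1·m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

wl = 4e-6                           # 4 µm, in the mid-wave IR band
incandescent = planck(wl, 2800.0)   # tungsten filament, ~2800 K
led = planck(wl, 330.0)             # LED fixture near ambient, ~330 K

print(f"Incandescent/LED radiance ratio at 4 µm: {incandescent / led:.0f}x")
```

Even with these rough numbers, the incandescent bulb out-radiates the LED fixture by roughly four orders of magnitude at this wavelength, which is why a swap to LEDs can leave an IR-based EFVS with almost nothing to see.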

As a result of this trend, EFVS OEMs are seeking other ways to see runways equipped with LEDs. “We are focusing on millimeter-wave radar to solve the problem,” said Hausmann. “But this is a big technical challenge for EFVS equipment, and I don’t believe it’s been solved yet.”

Certification of combined vision system expected soon

Bombardier Aviation expects certification of the Combined Vision System (CVS) for the company’s Global 5500 and 6500 business jets soon, the company said on Feb. 6. The Federal Aviation Administration certified the Global 5500 and 6500, equipped with the Bombardier Vision Flight Deck and the Collins Aerospace Pro Line Fusion avionics suite, in December. Over the last two years, Bombardier and Collins Aerospace have also been developing CVS for the jets — a system that Bombardier Aviation President David Coleal has heralded as the “first true combined vision system” developed solely for business jets.

CVS merges infrared enhanced vision system (EVS) and synthetic vision system (SVS) imagery into a single conformal view, increasing safety and reducing the pilot workload previously required to toggle between EVS and SVS.
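Bombardier has not published how CVS fuses the two sources, but the general idea of merging sensor and synthetic imagery can be sketched as a per-pixel blend weighted by sensor confidence. A minimal sketch under that assumption (function names, weights, and the confidence model are all illustrative, not Bombardier’s algorithm):

```python
def combine_vision(evs, svs, confidence):
    """Blend EVS and SVS pixel rows by per-pixel EVS confidence (0..1).

    evs, svs: grayscale pixel rows (0-255); confidence: same shape.
    Where the IR sensor returns a strong image, favour EVS; elsewhere
    fall back to the synthetic terrain rendering.
    """
    return [
        [round(c * e + (1.0 - c) * s) for e, s, c in zip(er, sr, cr)]
        for er, sr, cr in zip(evs, svs, confidence)
    ]

# Toy 1x4 frame: fog obscures the right half (low EVS confidence there)
evs = [[200, 180, 40, 30]]
svs = [[120, 120, 120, 120]]
conf = [[0.9, 0.8, 0.1, 0.0]]
print(combine_vision(evs, svs, conf))  # → [[192, 168, 112, 120]]
```

The blended frame tracks the live sensor where it sees well and falls back to the synthetic rendering where it does not, which is the behaviour the article attributes to CVS: the best view is always displayed without the pilot having to choose.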

Mathieu Noel, Bombardier Aviation’s director of product strategy and design, revealed to Avionics International last October that CVS will mean that pilots will not have to choose between EVS and SVS, “as the best view is always displayed,” and that pilots using CVS will glean “much more information from a single glance.”

Airbus ATTOL project lead describes technology behind vision-based A350 takeoff

Using a combination of image recognition technology and flight control computer modifications, Airbus successfully performed its first ever fully automatic vision-based takeoff with an A350-1000 in December. Avionics International recently caught up with Sebastien Giuliano, project leader for the Autonomous Taxi, Take-Off & Landing (ATTOL) project at Airbus to learn how the vision-based test flight was made possible.

The ATTOL project is a technological flight demonstrator initiative that began at Airbus in June 2018 as part of the airframer’s goal of understanding the impact of increased autonomy on aircraft. During a flight test on December 18, a crew of two pilots and three flight test engineers performed a total of eight take-offs and landings, achieving a major milestone for ATTOL: using image recognition technology in place of an Instrument Landing System (ILS) to perform an automatic takeoff.

According to Giuliano, the crew used several system modifications and camera upgrades on the A350 to perform the test flight.

“Avionics upgrades were limited to the flight control computer, and additional modifications were linked to the installation of cameras and additional computing capabilities linked to those cameras. This was done on an on-boarded demonstration platform which was linked to the modified Flight Control Computer but was not an avionics grade platform,” Giuliano said.

Airbus has not released the name of the company that supplied the image recognition technology for the flight; however, Giuliano describes it as “state of the art computer vision and hardware techniques that can be found in different industries, including the automotive industry.” Engineers from several divisions across Airbus participated in acquiring the image recognition technology and installing it on the aircraft.

“The cameras were adapted to our use case and environment. Those developments and adaptations were carried out by a team of Airbus engineers from different divisions — Airbus UpNext, Airbus Commercial, Airbus Defense and Space and A3 — with some support from ONERA as a subcontractor, mostly for data fusion of existing parameters with vision-based parameters,” Giuliano said.

Giuliano also emphasized that the flight test differs from a conventional automatic takeoff, in which the positioning of the aircraft relies solely on an ILS. An in-cockpit video of the takeoff also clearly shows the pilots’ hands off the controls as the aircraft rolls down the runway and prepares to take off.

Yann Beaufils, one of the two test pilots who participated in the flight test, described the process in a statement published by Airbus.

“While completing alignment on the runway, waiting for clearance from air traffic control, we engaged the auto-pilot. We moved the throttle levers to the take-off setting and we monitored the aircraft. It started to move and accelerate automatically maintaining the runway center line, at the exact rotation speed as entered in the system. The nose of the aircraft began to lift up automatically to take the expected take-off pitch value and a few seconds later we were airborne,” Beaufils said.
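Airbus has not disclosed its control laws, but the centerline-tracking behaviour Beaufils describes can be sketched as a simple proportional loop acting on camera-derived geometry: the vision system estimates the aircraft’s lateral offset from the detected centerline and its heading error, and the controller steers to null both. A minimal sketch under that assumption (gains, sign conventions, and limits are illustrative):

```python
def steer_command(offset_m, heading_err_deg, kp_offset=0.8, kp_heading=0.5):
    """Toy steering demand from camera-derived runway geometry.

    offset_m: lateral distance from the detected centerline (right positive).
    heading_err_deg: angle between aircraft track and the centerline.
    Returns a steering demand in degrees, clipped to +/-10,
    negative meaning 'steer left'.
    """
    cmd = -(kp_offset * offset_m + kp_heading * heading_err_deg)
    return max(-10.0, min(10.0, cmd))

# Aircraft drifting 2 m right of the centerline with the nose 1 deg right:
# the command is negative, steering back toward the centerline
print(steer_command(2.0, 1.0))
```

In the real aircraft these vision-derived parameters were fused with existing flight parameters and fed to the modified flight control computer; the sketch only illustrates why a camera estimate of offset and heading error is sufficient input for runway tracking.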

The vision-based automatic takeoff was the latest achievement in the Airbus A350 program, after Airbus completed the delivery of the first A350 featuring touchscreen displays to China Eastern Airlines. By mid-2020, the ATTOL project team hopes to achieve its next milestone: automatic vision-based taxi and landing sequences.

© 2020 State Research Institute of Aviation Systems. All rights reserved. Terms of Use.