Synthetic vision system


A synthetic vision system (SVS) is a computer-mediated reality system for aerial vehicles that uses 3D imagery to provide pilots with a clear and intuitive means of understanding their flying environment.
Synthetic vision is also a generic term that may refer to computer vision systems using artificial intelligence methods for visual learning; see "Synthetic Vision using Volume Learning and Visual DNA".

Functionality

Synthetic vision provides situational awareness to operators by using terrain, obstacle, geo-political, hydrological and other databases. A typical SVS application uses a set of databases stored on board the aircraft, an image-generator computer, and a display. The navigation solution is obtained through the use of GPS and inertial reference systems.
A Highway In The Sky, or Path-In-The-Sky, is often used to depict the projected path of the aircraft in perspective view. Pilots acquire an instantaneous understanding of the current and future state of the aircraft with respect to the terrain, towers, buildings and other environmental features.
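The central step an SVS image generator performs is a perspective projection of database points (terrain vertices, obstacles, or path waypoints) onto the display, driven by the aircraft position and attitude from the GPS/inertial navigation solution. The following Python sketch illustrates that step in minimal form; the rotation sequence, field of view, and screen dimensions are illustrative assumptions, not any particular vendor's implementation.

import math

def rotation_body_from_world(roll, pitch, yaw):
    """Direction-cosine matrix mapping world (NED) vectors into body axes."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Standard aerospace yaw-pitch-roll rotation sequence.
    return [
        [cp * cy,                cp * sy,               -sp],
        [sr * sp * cy - cr * sy, sr * sp * sy + cr * cy, sr * cp],
        [cr * sp * cy + sr * sy, cr * sp * sy - sr * cy, cr * cp],
    ]

def project_point(world_pt, aircraft_pos, roll, pitch, yaw,
                  screen_w=1024, screen_h=768, fov_deg=60.0):
    """Return pixel coordinates of a world point, or None if it is behind the aircraft."""
    # Translate into an aircraft-centred frame, then rotate into body axes.
    rel = [world_pt[i] - aircraft_pos[i] for i in range(3)]
    R = rotation_body_from_world(roll, pitch, yaw)
    x = sum(R[0][j] * rel[j] for j in range(3))  # forward
    y = sum(R[1][j] * rel[j] for j in range(3))  # right
    z = sum(R[2][j] * rel[j] for j in range(3))  # down
    if x <= 0.0:
        return None  # behind the eye point, not drawn
    # Simple pinhole projection onto the display plane.
    f = (screen_w / 2) / math.tan(math.radians(fov_deg) / 2)
    px = screen_w / 2 + f * (y / x)
    py = screen_h / 2 + f * (z / x)
    return px, py

# Example: a terrain vertex 2 km ahead and 300 m below, aircraft pitched up 5 degrees.
print(project_point((2000.0, 0.0, 300.0), (0.0, 0.0, 0.0),
                    roll=0.0, pitch=math.radians(5), yaw=0.0))

Repeating this projection for every terrain vertex and waypoint in the on-board database, each display frame, yields the perspective terrain and path imagery described above.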

History

A forerunner to such systems existed in the 1960s, with the debut into U.S. Navy service of the Grumman A-6 Intruder carrier-based medium-attack aircraft. Designed with a side-by-side seating arrangement for the crew, the Intruder featured an advanced navigation/attack system, the Digital Integrated Attack and Navigation Equipment (DIANE), which linked the aircraft's radar, navigation and air data systems to a digital computer known as the AN/ASQ-61. Information from DIANE was displayed to both the pilot and the bombardier/navigator on cathode ray tube screens. In particular, one of those screens, the AN/AVA-1 Vertical Display Indicator, showed the pilot a synthetic view of the world in front of the aircraft and, in Search Radar Terrain Clearance mode, depicted the terrain detected by the radar as coded lines representing preset range increments. Called 'Contact Analog', this technology allowed the A-6 to be flown at night, in all weather conditions, at low altitude, and through rugged or mountainous terrain without the need for any visual references.
Synthetic vision was developed by NASA and the U.S. Air Force in the late 1970s and 1980s in support of advanced cockpit research, and in the 1990s as part of the Aviation Safety Program. Development of the High Speed Civil Transport fueled NASA research in the 1980s and 1990s. In the early 1980s, the USAF recognized the need to improve cockpit situation awareness to support the piloting of ever more complex aircraft, and pursued SVS as an integrating technology for both manned and remotely piloted systems.

Simulations and remotely piloted vehicles

In 1980, the FS1 Flight Simulator by Bruce Artwick for the Apple II microcomputer introduced recreational use of synthetic vision.
Remotely piloted aircraft cockpit with synthetic vision display
NASA used synthetic vision for remotely piloted vehicles such as the Highly Maneuverable Aircraft Technology (HiMAT) testbed. According to the NASA report, the aircraft was flown by a pilot in a remote cockpit; control signals were uplinked from the flight controls in the remote cockpit on the ground to the aircraft, and aircraft telemetry was downlinked to the remote cockpit displays. The remote cockpit could be configured with either nose camera video or a 3D synthetic vision display. Synthetic vision was also used for simulations of the HiMAT. Sarrafian reports that the test pilots found the visual display to be comparable to the output of the camera on board the RPV.
The 1986 RC Aerochopper simulation by Ambrosia Microcomputer Products, Inc. used synthetic vision to help aspiring RC aircraft pilots learn to fly. The system included joystick flight controls that connected to an Amiga computer and display. The software included a three-dimensional terrain database for the ground as well as some man-made objects. This database was basic, representing the terrain with a relatively small number of polygons by today's standards. The program simulated the dynamic three-dimensional position and attitude of the aircraft and used the terrain database to create a projected 3D perspective display. The realism of this RPV pilot training display was enhanced by allowing the user to adjust the simulated control-system delays and other parameters.
Similar research continued in the U.S. military services and at universities around the world. In 1995–1996, North Carolina State University flew a 17.5% scale F-18 RPV using Microsoft Flight Simulator to create the three-dimensional projected terrain environment.

In flight

In 2005, a synthetic vision system was installed on a Gulfstream V test aircraft as part of NASA's "Turning Goals Into Reality" program. Much of the experience gained during that program led directly to the introduction of certified SVS on later aircraft. NASA had initiated industry involvement with major avionics manufacturers in early 2000.
Eric Theunissen, a researcher at Delft University of Technology in the Netherlands, contributed to the development of SVS technology.
In late 2007 and early 2008, the FAA certified the Gulfstream Synthetic Vision-Primary Flight Display system for the G350/G450 and G500/G550 business jet aircraft, which displays 3D color terrain imagery derived from Honeywell EGPWS data overlaid with the PFD symbology.
It replaces the traditional blue-over-brown artificial horizon.
In 2017, Avidyne Corporation certified Synthetic Vision capability for its air navigation avionics.
Other glass cockpit systems such as the Garmin G1000 and the Rockwell Collins Pro Line Fusion offer synthetic terrain.
Lower-cost, non-certified avionics also offer synthetic vision, such as apps for Android and iPad tablet computers from ForeFlight, Garmin, and Hilton Software.

Regulations and standards
