AviatAR: An Augmented Reality System to Improve Pilot Performance for Unmanned Aerial Systems
In the modern airspace, small unmanned aircraft systems (sUAS) such as drones are becoming increasingly popular with both amateur enthusiasts and professional pilots. In the three years following the initiation of the small drone registration rule in 2015, over one million drones were registered in the U.S. alone, and the United States Federal Aviation Administration (FAA) estimates that the number of registrations could exceed 3.8 million by 2022. In recognition of the necessity to integrate sUAS traffic into the national airspace system, Congress passed the FAA Modernization and Reform Act of 2012, which mandated that the FAA regulate sUAS operation in United States national airspace (NAS). This also created a number of obligations for drone pilots, including avoidance of restricted airspace, observance of maximum flight altitudes, safe separation from aircraft - including other UAS - and avoidance of flight over people and of contact with personal property such as buildings or cars. These safety obligations, coupled with the goals of recreational or commercial flight, act to degrade pilot situational awareness (SA): because of the nature of flying a drone for pleasure or for commercial purposes, it is very easy for operators to lose their SA of the surrounding environment. A study published by the NASA Langley Research Center (LaRC) in 2017 found that the majority of commercial aviation accidents not attributable to aircraft systems failure involved the crew’s loss of SA of the aircraft or the environment, and that crew distraction from operation was associated with all of these accidents. If this is the case for commercial aircraft pilots working in the relative isolation of an enclosed cockpit, it is easy to imagine that the potential for distraction in the UAS environment is at least as great.
This demonstrates the potential for a degraded SA state to create an unsafe environment for other pilots and bystanders, and to lead to fines and penalties for the drone pilot if damage, injury, or disruption to the airspace occurs. While a pilot or flight crew can often be distracted by agents not directly associated with the operation of an aircraft, there are also many flight phenomena that demand a pilot’s focus to manage, and this focused attention can likewise degrade a pilot’s SA. One pathological flight phenomenon in fixed-wing aircraft is pilot-induced oscillation (PIO). PIO can occur either as a result of pilot-airframe coupling, as in the case of biodynamic feedthrough, or as a result of the lag between the pilot’s observations and actions and the propagation of those actions through the control response of the aircraft under structural or environmental stimuli. Under either scenario, the actions necessary to identify and resolve these PIO can quickly distract the pilot and degrade pilot SA. This distraction can lead to mission task element (MTE) failure, loss of aircraft control, and damage or destruction of the aircraft and surrounding persons and property. In this study, we make an effort to identify the state of the art in pilot situational awareness research and to understand the critical pilot-aircraft interactions at the forefront of research in this field. Pilot-induced oscillations, especially of the Type III nonlinear family, are one such topic that researchers have studied heavily for over 40 years while making minimal progress toward a solution. In fact, the addition of autonomous control functions to modern avionics systems, such as control surface rate limiting, has increased the severity of these Type III PIO when they occur.
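The lag mechanism described above can be illustrated with a toy model (a sketch for intuition, not a model from this study): a pilot applies a proportional correction, but acts on an observation of the vehicle’s position that is several control steps stale. With zero delay the error decays smoothly; with the same gain and a few steps of delay, the correction repeatedly overshoots and the position oscillates about the target.

```python
# Illustrative sketch of delay-induced oscillation (hypothetical parameters,
# not the pilot or aircraft model used in this study).
def simulate(gain, delay, steps=60, target=0.0):
    """Return the position history of a 1-D vehicle under delayed
    proportional control: x[t+1] = x[t] - gain * (x[t - delay] - target)."""
    history = [1.0] * (delay + 1)  # start offset from the target
    for _ in range(steps):
        observed_error = history[-1 - delay] - target  # stale observation
        history.append(history[-1] - gain * observed_error)
    return history

# With no delay, the error decays monotonically toward the target.
stable = simulate(gain=0.5, delay=0)

# With a 3-step delay, the same gain overshoots the target repeatedly,
# and the position oscillates with growing amplitude.
oscillating = simulate(gain=0.5, delay=3)
```

In this simple linear model the pilot’s correction is always applied to where the vehicle *was*, not where it *is*, so each correction overshoots; whether the resulting oscillation decays or grows depends on the gain-delay combination, mirroring how aggressive corrections under lag can drive a real PIO divergent.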
To broaden the context of our work, similar issues are faced in a variety of other vehicle control situations, including ships, cars, farm equipment, and large trucks. To support our goal of improving the experience, productivity, and safety of remote UAS pilots, we implemented a see-through augmented reality headset system, AviatAR, to provide information to the pilot with minimal detrimental effects. We also created a method, the Flight-Space Volume Model (FSVM), to enable accurate placement of an augmented reality cue, which we refer to as a Gizmo, in the pilot’s visual field. This cue serves to improve pilot accuracy and to notify the pilot of predicted pilot-induced oscillations, enabling the pilot to react actively to this phenomenon in its early stages of formation. During our research, one issue that we noticed was a dearth of research on pilot SA and PIO phenomena associated with multirotor UAS. Because of the increase in the number of UAS available for both personal and commercial use, we felt that many contributions could be made to several fields of study. Thus, our research is a multidisciplinary work that supports three fields: computer science, cognitive science, and aeronautics. In our experiments, we evaluated the performance of a pilot flying a low-autonomy, unaugmented quadcopter in an outdoor, uncontrolled setting, comparing an unaided pilot against one equipped with three successive evolutions of the AviatAR head-mounted display. Across these evolutions, we evaluated the Gizmo component alone as well as two variations of the PIO cue, placed in the superior and inferior peripheral visual fields, and measured pilot performance in terms of positional accuracy. Our analysis demonstrates that a pilot equipped with the AviatAR display outperformed the unaided pilot, and that the addition of the PIO cue further enhanced pilot accuracy.
Further, we demonstrated that placing a persistent, low-complexity informational cue in the inferior peripheral visual field of the pilot significantly improved pilot accuracy over the superior-field cue. This result motivates future work to better understand how low-complexity inferior peripheral field cues can enhance the performance and situational awareness of equipment operators in a broad range of applications.