Design of a Synthetic Vision overlay for UAV autoland monitoring

Jochum Tadema *a, Erik Theunissen a,b

a Netherlands Defence Academy, Het Nieuwe Diep 8, 1781 AC, Den Helder, The Netherlands;
b Delft University of Technology, Mekelweg 4, 2628 CD, Delft, The Netherlands

ABSTRACT

For Unmanned Aerial Vehicles (UAVs), autonomous forms of autoland are being pursued that do not depend on special, deployability-restraining ground-based equipment for the generation of the reference path to the runway. Typically, these forms of autoland use runway location data from an onboard database to generate the reference path to the desired location. Synthetic Vision (SV) technology provides the opportunity to use conformally integrated guidance reference data to ‘anchor’ the goals of such an autoland system into the imagery of the nose-mounted camera. A potential use of this is to support the operator in determining whether the vehicle is flying towards the right location in the real world, e.g., the desired touchdown position on the runway. Standard conformally integrated symbology, representing, e.g., the future pathway and runway boundaries, supports conformance monitoring and detection of latent positioning errors. Additional integration of landing performance criteria into the symbology supports assessment of the severity of these errors, further aiding the operator in the decision whether the automated landing should be allowed to continue. This paper presents the design and implementation of an SV overlay for UAV autoland procedures that is intended for conformance and integrity monitoring during final approach. It provides a preview of mode changes and decision points and supports the operator in assessing the integrity of the guidance solution that is used.

Keywords: UAV, autoland, conformal integration, guidance reference data, integrity assessment

1. INTRODUCTION

1.1 Role of the human operator

In manned aviation, the main reason for the development of an autoland capability was to be able to land in reduced visibility. For over thirty years, the Instrument Landing System (ILS) Cat III autoland system has provided aircraft with the capability to land under zero-visibility conditions [1]. For UAVs, autonomous forms of autoland are being pursued that do not depend on special, deployability-restraining ground-based equipment for the generation of the reference path to the runway. In contrast to commercial aviation, the main reason for developing a UAV autoland capability is to reduce the mishap rates associated with the landing [2]. This raises the question whether there still is a role for the human operator during the landing and, if so, what this role is. Factors that influence the answer include the integrity and reliability of the autoland function and its dependency on operator involvement for the interpretation and integration of required information.

Rasmussen [3] distinguishes between three different levels at which tasks can be performed: the skill-based, rule-based and knowledge-based level. This classification is useful for structuring the analysis regarding the allocation of the tasks required for the landing of a UAV. Given that the control function is fully automated, the operator has no tasks at the skill-based level. The next level to be addressed is the rule-based level. Here, operator involvement would be at the decision-making level, using pre-defined criteria. For UAV autoland, the requirements to commence and continue a landing and the factors that require a missed approach need to be defined. The decision cycle then involves assessment of the system’s compliance with these ‘rules’. This requires information from various sources, for example: vehicle status, environmental conditions, operational factors, datalink status, guidance status and runway status. If not all of the required information is available to the automated system directly and in the appropriate format, or in case some of the rules or associated exceptions vary and cannot be integrated into the automated system, involvement of the human operator is required. In the current study, it is hypothesized that the human operator can contribute to the overall system performance by applying the capability to integrate and compare information from dissimilar sources. Especially when it comes to interpretation and integration of information from the nose-mounted camera on, e.g., guidance and runway status, human operator involvement is believed to have advantages over current-generation machine vision capabilities.

A common factor in most UAV autoland concepts is the use of runway location data from an onboard database to generate the reference path to the desired location. As a result, the integrity of this guidance reference is a function of the integrity of the onboard position estimation and the reference position database. This introduces the possibility of latent failures that cause the UAV to be commanded towards a location outside of the allowed touchdown area. Through assessment of the information obtained from the nose-mounted camera, together with the guidance reference data and the vehicle state data, the human operator can detect path definition and navigation sensor errors that may not, or even cannot, be caught by the on-board guidance system. In the envisioned concept of operations, the operator has to determine whether the vehicle is flying towards the correct location in the real world, e.g., the desired touchdown position on the runway.

In general, we assume that the primary role of the human operator will be conformance and integrity monitoring. Following this hypothesized role, the main question to be addressed is: “How to support the operator in the detection and comprehension of abnormal flight conditions, in order to make the decision to consent to, or intervene with, the automated landing process?”

1.2 Potential of spatially integrated presentation

Several new display concepts aim to provide operator support by depicting the future flight path as a perspectively projected pathway. For manual control of manned aircraft, advantages relative to the conventional Flight Director (FD) command display have been demonstrated in the following areas:

• reduced control effort, because the preview of changes in the path allows the operator to better distinguish between control required for guidance and control needed to compensate for disturbances [4];

• the ability to apply control strategies other than purely compensatory control, in which the operator basically acts as a servo by continuously trying to zero the error [4];

• better tracking performance, specifically for more complex trajectories [5].

Additionally, it has been demonstrated that for the manual control of a UAV, with its associated limitations of data update rate and system latency, a perspective guidance display is, in terms of precision tracking performance and required control effort, consistently superior to an FD command display [6]. This raises the question whether there is potential for conformal integration of the trajectory data in the area of supervisory control as well; more specifically, to support conformance and integrity monitoring of an autoland system.

Monitoring the progress of an automated process comprises validating the desired future state (objective, goal) and determining whether the current state lies within the allowed margins. In practice, this means the operator monitoring the autoland system has to determine whether:

• scheduled goals, such as future flight path and flight mode, are in compliance with the requirements defined in the applicable rules;

• current control actions yield the required results, e.g., sufficient autopilot tracking performance.

The applicable rules generally depend on the phase of the flight. This means that to support the operator in monitoring the autoland system, the user interface must provide awareness of i) the currently active phase and corresponding goals, ii) whether the required goals are met, and iii) the conditions that need to be met in order to transition to the next phase. The conventional approach to support conformance monitoring is to depict the status of the various parameters on a scale with an indication of the margins within which the values need to remain. With conventional Head-Up Display (HUD) symbology, an important element that indicates autopilot tracking performance is the FD. In our research into display concepts for manned aircraft, we have concluded that in case the pilot is required to take over from the automatic guidance system, the perspective presentation of the future path can be advantageous. When information about the scheduled actions of the autopilot is integrated, it provides an excellent means to determine whether the autopilot’s actions are in line with the required guidance. It is expected that this advantage also exists in case the operator performs conformance monitoring at a supervisory level without being required to revert to manual control. The underlying assumption is that, using conformally integrated guidance reference data, trend information on the lateral and vertical tracking performance of the autopilot can be obtained from a single snapshot of the situation. In contrast, with two separate indicators as in a conventional FD, only the current state can be observed and the operator needs to derive trend information from the motion of the indicators relative to the margins. This requires a longer time for perception of the required information. Additionally, an integrated preview of the flight path and its constraints allows the operator to anticipate upcoming changes in position margins. For conformance monitoring, no direct quantitative estimate is required; the ability to determine whether there will be an excursion of the margins suffices.
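To make this conformance-preview idea concrete, the minimal Python sketch below checks whether a previewed path segment is expected to violate its (possibly varying) margins. It is an illustration only; the sample structure, field names and margin values are assumptions, not part of the described system.

# Illustrative sketch of the conformance check that path preview enables:
# given predicted deviations along the upcoming path segment and the tunnel
# margins at each point, flag whether an excursion is expected.

from dataclasses import dataclass
from typing import List


@dataclass
class PathSample:
    distance_m: float          # along-track distance ahead of the vehicle
    lateral_dev_m: float       # predicted cross-track deviation from the reference
    vertical_dev_m: float      # predicted vertical deviation from the reference
    lateral_margin_m: float    # allowed lateral deviation at this point
    vertical_margin_m: float   # allowed vertical deviation at this point


def predicted_excursion(preview: List[PathSample]) -> bool:
    """Return True if any previewed sample violates its margins."""
    return any(abs(s.lateral_dev_m) > s.lateral_margin_m or
               abs(s.vertical_dev_m) > s.vertical_margin_m
               for s in preview)


# Example: margins tighten from 60 m to 30 m over the next 2 km while the
# predicted lateral deviation slowly grows, so an excursion is anticipated.
preview = [PathSample(d, lateral_dev_m=0.02 * d, vertical_dev_m=2.0,
                      lateral_margin_m=60.0 - 0.015 * d, vertical_margin_m=15.0)
           for d in range(0, 2001, 250)]
print(predicted_excursion(preview))  # True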

Another advantage is that conformal integration of the trajectory data enables intuitive anticipation of scheduled actions and decision points, as it provides the possibility to display indications of such events connected to the depiction of the future flight path, at the locations where they become relevant. It is believed that this preview increases total system predictability, resulting in better level 3 Situation Awareness (SA). In case of abnormal flight conditions, a better level 3 SA should manifest itself through earlier detection and comprehension of such events.

The ability of the human operator to assess the integrity of the reference path in a timely manner, based on the information from the nose-mounted camera, is determined by the type and magnitude of the error and the additional information that is presented. With conventional HUD symbology, an important element that supports guidance integrity assessment is the Flight Path Marker (FPM), which indicates the inertial direction of flight. On the final, straight part of the approach, the FPM should be pointing at the desired touchdown location. However, the relative location of the FPM and the runway in the sensor image is influenced by the path steering error, making it harder for the pilot to distinguish between temporary but acceptable steering errors and an integrity problem. Using conformally integrated guidance reference data to ‘anchor’ the goals of the autoland system into the imagery of the nose-mounted camera provides an opportunity to support guidance integrity assessment in a way that is not influenced by the steering error component. Rather than using the information about the impact of the guidance reference on elements of the momentary vehicle state (represented by the FPM), the guidance reference itself (i.e., the perspective representation of approach path and runway) can be checked against the real-world constraints. In the absence of path definition and navigation system errors, the depiction of the guidance reference is aligned with the real runway. In ref. [7], this concept has been explored for manned aviation.

1.3 Level of operator involvement

When designing the user interface for a system with a certain level of autonomy, it needs to be established at what levels the human operator has the authority to intervene and how different levels of operator involvement influence performance. To address these aspects in this study, two different interaction concepts have been identified, based on Sheridan’s eight Levels of Authority (LOAs) [8]. With a lower level of system authority (LOA 4), the autoland function always requires the operator’s approval to transition to the next flight phase, even in normal operation, when no problems are detected. With a higher level of system authority (LOA 5), the system automatically transitions to the next phase unless the operator intervenes. Alternatively, these modes of operation can be classified as management by consent and management by exception [9].
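As an illustration of the distinction, the short Python sketch below gates a flight-phase transition on either explicit consent (LOA 4) or the absence of intervention (LOA 5). It is not the authors' implementation; the phase names and function are assumptions used only to make the two principles explicit.

# Illustrative sketch of how LOA 4 and LOA 5 gate a flight-phase transition:
# under management by consent the system holds until the operator approves;
# under management by exception it proceeds unless the operator intervenes.

from enum import Enum, auto


class Phase(Enum):
    APPROACH = auto()
    LAND = auto()
    GO_AROUND = auto()


def next_phase(current: Phase, loa: int,
               operator_consented: bool, operator_intervened: bool) -> Phase:
    """Decide whether to advance from the current phase to the scheduled one."""
    if operator_intervened:
        return Phase.GO_AROUND                 # intervention always aborts the landing
    scheduled = Phase.LAND if current is Phase.APPROACH else current
    if loa == 4:                               # management by consent
        return scheduled if operator_consented else current
    if loa == 5:                               # management by exception
        return scheduled
    raise ValueError("only LOA 4 and LOA 5 are modelled in this sketch")


print(next_phase(Phase.APPROACH, 4, operator_consented=False, operator_intervened=False))  # APPROACH
print(next_phase(Phase.APPROACH, 5, operator_consented=False, operator_intervened=False))  # LAND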

An autoland function design based on the consent principle might keep the operator more involved in the decision to land. By approving the system to transition from approach mode to landing mode the operator explicitly takes responsibility for the decision to land. On the other hand, compared to the exception principle, it does involve additional operator interaction for a nominal situation. This might lead to perfunctory decisions and increased system opacity.

2. INITIAL DESIGN

Based on the identified potential of conformally integrated trajectory preview for autoland conformance and integrity monitoring, this section covers the initial design of the UAV autoland user interface. Furthermore, a concept that supports the operator in the identification of certain sensor failures is discussed.

2.1 Conformance monitoring

Figures 1 to 5 present the initial design of the graphical user interface (GUI). It comprises a display format (a) that serves as an overlay of the imagery of the forward-looking, nose-mounted camera and a simplified control panel (b) that allows the operator to interact with the system. The control panel has been designed to provide feedback to the operator about the status of the autoland process and to allow the operator to set the required states. To support the operator in quickly identifying the current status, the data presentation reflects the dependency between the different elements and the required state. In this study, the chosen elements, required states and their relationships serve an illustrative purpose only. To prevent the need for switching attention between displays to obtain information on the flight status, the indicators for the phases are presented both on the autoland control panel and in the sensor overlay.

The overlay provides the operator with a preview of the future flight path by means of a conformally integrated pathway. This supports validation of current and scheduled flight profile transitions, such as an upcoming transition to the glide path (Figure 1) or an immediate transition to the missed approach path (Figure 5). The tunnel dimensions provide information on the spatial constraints, e.g., the maximum allowable tracking errors for final approach as specified in the abort criteria (Figure 2). Information about the velocity vector, speed deviations and acceleration/deceleration is provided by means of the FPM with a color-coded speed error bar and an acceleration caret. An aircraft attitude reference symbol (waterline symbol) is also included. To support assessment of the vehicle’s attitude in the final phase of the landing, the attitude constraints appear when the automatic flare is initiated (Figure 4). Although, from an information analysis point of view, the operator would not need quantitative state information, communication with ATC may require the depiction of some numerical data. Therefore, heading, speed and altitude readouts are included in the overlay as well.

When the LOA is chosen such that operator consent is required for transitioning between flight phases, annunciators indicating the spatial location of the corresponding decision points (e.g., decision height) are integrated into the perspective world. Similar to the depiction of scheduled changes in the flight profile, the resulting ‘billboard-like’ signs provide the operator with an intuitive means to establish the relationship between the spatial and temporal domain, increasing system predictability and thus supporting anticipatory behavior. As an example, the billboards depicted on the right side of the glide path in Figure 2 indicate the interval at which the operator is required to provide consent.
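As a worked example of where such a billboard would be anchored, the short sketch below computes the along-track distance from the runway threshold at which a straight glide path crosses the decision height; all numerical values are illustrative assumptions.

# Worked example: along-track distance from the runway threshold at which a
# straight glide path crosses the decision height, i.e., where the DH
# billboard would be anchored in the perspective world.

import math

glide_path_angle_deg = 3.0          # nominal final-approach angle
decision_height_m = 60.0            # height above the threshold at the decision point
threshold_crossing_height_m = 15.0  # height of the glide path above the threshold

# Height above the threshold decreases linearly with distance to go, so the
# billboard sits where h(d) equals the decision height.
distance_to_threshold_m = (decision_height_m - threshold_crossing_height_m) \
    / math.tan(math.radians(glide_path_angle_deg))
print(round(distance_to_threshold_m))  # ~859 m before the threshold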

2.2 Integrity monitoring

To support detection of latent errors in the guidance solution, conventional HUDs show a runway outline that is obtained by using the glide slope and localizer signals to compute the required transformations that need to be applied to a 2-D shape representing the runway. In the concept presented in this paper, the depiction of the runway outline (Figure 3) is computed from a 3-D database containing the runway location. The pathway to the runway (Figures 1 and 2) is computed from the same reference data that is used by the system that computes the control commands. Since the conformal integration of these data in the sensor image is based on the vehicle state reported by the navigation system, misalignment between the glide path symbology, the runway outline symbology and the real runway indicates the presence of path definition and/or navigation sensor errors.
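For illustration, the sketch below shows one way the runway outline vertices could be derived from a database entry (threshold position, true heading, length, width) in a local north-east-down frame. The field names and frame convention are assumptions; the paper only states that the outline is computed from a 3-D database containing the runway location.

# Illustrative sketch: runway outline corner points from a (hypothetical)
# database entry, expressed as world-frame vertices in a local NED frame.

import math
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # north, east, down (metres)


def runway_outline(threshold: Vec3, heading_deg: float,
                   length_m: float, width_m: float) -> List[Vec3]:
    """Return the four corner points of the runway as world-frame vertices."""
    h = math.radians(heading_deg)
    along = (math.cos(h), math.sin(h))    # horizontal unit vector down the runway
    across = (-math.sin(h), math.cos(h))  # horizontal unit vector to the right
    n0, e0, d0 = threshold
    corners = []
    for a, c in ((0.0, -0.5), (0.0, 0.5), (1.0, 0.5), (1.0, -0.5)):
        corners.append((n0 + a * length_m * along[0] + c * width_m * across[0],
                        e0 + a * length_m * along[1] + c * width_m * across[1],
                        d0))
    return corners


# A 2500 m x 45 m runway pointing due east, threshold at the local origin:
print(runway_outline((0.0, 0.0, 0.0), heading_deg=90.0, length_m=2500.0, width_m=45.0))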

Fig. 1 a/b. Approach mode ready to be initiated, meaning the transition to glide path is scheduled

Fig. 2 a/b. Glide path intercepted: approach mode active; go-around mode ready to be initiated, should abort criteria be met

Fig. 3 a/b. Runway confirmed: land mode active; go-around mode ready to be initiated, should abort criteria be met

Fig. 4 a/b. Attitude margin indication during flare; land mode active; go-around mode ready to be initiated, should abort criteria be met

Fig. 5 a/b. Go-around mode initiated: missed approach path appears

Fig. 6 a/b. Gyro system failure identification: lane 2 contains a 1.5 degree roll error


2.3 Additional opportunity: fault identification and isolation

Fault identification and isolation requires an architecture in which the output of at least three independent but functionally identical systems can be compared. If, for whatever reason (design choices or a previous lane failure in a fail-operational system), no automatic fault identification is available (fail-passive configuration), there is an opportunity for the human operator to increase system performance by assessing the information that is available, through comparison of the nose-cam video with instrument data.

To explore this idea, functions were implemented to simulate gyro system lane failures and to depict the resulting failure data in the sensor overlay. Figure 6 illustrates this concept; it represents a situation in which a failure has occurred in one of the lanes of a fail-passive gyro system. Since the system is fail-passive, it can automatically perform fault detection, but not identification and isolation. By presenting the output of both (conflicting) attitude lanes as artificial horizons overlaying the sensor image, the operator might be able to successfully select the correct lane based on the real horizon, creating the possibility to continue the landing, albeit in a single-lane configuration.

To fully focus the attention on the failure identification and isolation problem, no unnecessary symbology is presented in such cases, meaning the perspective pathway, FPM and waterline symbology disappear. To stress the fact that there is uncertainty about the correctness of the presented attitude data, the artificial horizons are depicted as dashed lines. After selection of a lane, the display reverts to its default mode.
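A minimal sketch of this interaction is given below: when the two lanes disagree, both attitudes are offered as dashed candidate horizons and the remaining symbology is suppressed; after a lane is selected, the overlay reverts to its default mode. The names, data structures and disagreement threshold are assumptions for illustration.

# Illustrative sketch of the lane-identification interaction for a
# fail-passive gyro system: conflicting lanes are both shown (dashed) until
# the operator selects one, after which the default overlay returns.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Attitude:
    roll_deg: float
    pitch_deg: float


def lanes_disagree(lane1: Attitude, lane2: Attitude, threshold_deg: float = 0.5) -> bool:
    return (abs(lane1.roll_deg - lane2.roll_deg) > threshold_deg or
            abs(lane1.pitch_deg - lane2.pitch_deg) > threshold_deg)


def overlay_mode(lane1: Attitude, lane2: Attitude, selected_lane: Optional[int]) -> dict:
    """Return a high-level description of what the overlay should show."""
    if selected_lane is None and lanes_disagree(lane1, lane2):
        return {"mode": "lane_identification",
                "horizons": [("lane 1", lane1, "dashed"), ("lane 2", lane2, "dashed")],
                "pathway": False, "fpm": False, "waterline": False}
    attitude = lane1 if selected_lane in (None, 1) else lane2
    return {"mode": "default",
            "horizons": [("selected", attitude, "solid")],
            "pathway": True, "fpm": True, "waterline": True}


# Lane 2 carries a 1.5 degree roll error, as in Figure 6:
print(overlay_mode(Attitude(2.0, 1.0), Attitude(3.5, 1.0), selected_lane=None)["mode"])  # lane_identification
print(overlay_mode(Attitude(2.0, 1.0), Attitude(3.5, 1.0), selected_lane=1)["mode"])     # default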

2.4 Feasibility

The perspective pathway and runway outline are rendered by means of a wireframe representation, as can be observed from the example figures. All required transformations, the perspective projection and the clipping are performed in software. Consequently, these elements of the overlay concept can be implemented on any graphics system that is capable of rendering 2-D vectors. Implementation of the billboards requires a graphics system that is also capable of texture mapping.
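The following sketch illustrates such a software transformation chain for a single vertex: a world-frame point is rotated into the body/camera frame using the reported vehicle attitude, clipped if it lies behind the camera, and perspectively projected to 2-D overlay coordinates. A boresighted pinhole nose camera and this particular rotation convention are assumptions, not details taken from the paper.

# Sketch of projecting one world-frame vertex (e.g., a pathway or runway
# corner) to 2-D overlay coordinates, with a simple near-plane clip.

import math
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]  # north, east, down (metres)


def world_to_overlay(point: Vec3, position: Vec3,
                     yaw_deg: float, pitch_deg: float, roll_deg: float,
                     focal_px: float = 800.0) -> Optional[Tuple[float, float]]:
    """Project one world point to overlay pixels, or None if it is behind the camera."""
    dn, de, dd = (p - o for p, o in zip(point, position))
    cy, sy = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    cp, sp = math.cos(math.radians(pitch_deg)), math.sin(math.radians(pitch_deg))
    cr, sr = math.cos(math.radians(roll_deg)), math.sin(math.radians(roll_deg))
    # Rotate the NED offset into the body frame: yaw, then pitch, then roll.
    x1, y1, z1 = cy * dn + sy * de, -sy * dn + cy * de, dd
    x2, y2, z2 = cp * x1 - sp * z1, y1, sp * x1 + cp * z1
    x, y, z = x2, cr * y2 + sr * z2, -sr * y2 + cr * z2
    if x <= 1.0:                                # simple near-plane clip
        return None
    return focal_px * y / x, focal_px * z / x   # right and down, in overlay pixels


# A point 1000 m ahead and 30 m below the vehicle, wings level, 3 deg nose down,
# appears slightly above the display centre (negative 'down' coordinate):
print(world_to_overlay((1000.0, 0.0, 30.0), (0.0, 0.0, 0.0), 0.0, -3.0, 0.0))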

3. REFINEMENTS

This section describes the refinements made to the initial design, based on experience gained in the testing phase. The purpose of the refinements was to address occlusion issues and to increase support for the guidance integrity assessment task.

3.1 Addressing occlusion issues

When on the glide path (Figure 2), the tunnel dimensions represent the abort criteria with respect to the maximum allowable tracking errors. Although this information is relevant for conformance monitoring, the depiction of the tunnel appeared to cause some interference with the visual information required for the guidance integrity assessment. To address this issue, the intensity of the tunnel symbology was made dependent on the magnitude of the tracking error. In this way, attention is only drawn away from the integrity monitoring task in case of a non-negligible tracking error (conformance issue).
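A minimal sketch of this refinement follows: the tunnel opacity is a simple function of the tracking-error magnitude, staying nearly transparent while the error is negligible and becoming solid as a conformance issue develops. The breakpoint values are illustrative assumptions, not values from the study.

# Illustrative mapping from tracking-error magnitude to tunnel opacity.

def tunnel_alpha(tracking_error_m: float,
                 negligible_m: float = 2.0, saturate_m: float = 15.0) -> float:
    """Map the tracking-error magnitude to an opacity in [0.1, 1.0]."""
    e = abs(tracking_error_m)
    if e <= negligible_m:
        return 0.1                                   # barely visible
    if e >= saturate_m:
        return 1.0                                   # fully solid
    return 0.1 + 0.9 * (e - negligible_m) / (saturate_m - negligible_m)


print(tunnel_alpha(1.0), tunnel_alpha(8.5), tunnel_alpha(20.0))  # 0.1 0.55 1.0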

To further reduce potential occlusion of the visual runway, the FPM and waterline symbology were chosen to be transparent. Additionally, it was decided to depict only the lower half of the tunnel. In this configuration, awareness of the upper position bounds is lost, but awareness of the more critical lower and lateral limits remains. The resulting display format is illustrated in Figures 7 and 8.

Fig. 7. Relatively large tracking error: path solid

Fig. 8. Negligible tracking error: path transparent

3.2 Aiding integrity assessment

Section 2.2 described how the initial design accommodates integrity monitoring of the guidance solution by supporting the detection of path definition and navigation sensor errors. However, the initial design does not support the operator in distinguishing between minor approach irregularities and significant problems; i.e., it lacks support for assessing the severity of such an integrity problem. Assessment of the integrity of the reference path used by the autoland function involves two steps:

• detecting potential discrepancies between the runway depicted in the nose-cam video and the associated HUD symbology (synthetic glide path and runway contour);

• deciding whether detected discrepancies exceed the landing performance criteria, or Touch Down Zone (TDZ) limits.

Thus, to support the operator in this assessment task, information on the TDZ limits should be provided. One way of achieving this is to integrate the TDZ dimensions into the synthetic runway outline of the concept illustrated in Figure 3, to form a synthetic ‘runway containment contour’. This runway containment contour replaces the original runway outline. In the resulting concept, the vehicle is heading towards a position within the allowed TDZ when the runway depicted in the nose-cam video is enveloped by the synthetic contour. To address observed issues with the perceptibility of the type of position errors that causes short landings, additional markers have been implemented that bring the required cues more into the foreground of the perspective world. This idea is exemplified in Figure 9.
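The sketch below illustrates the geometric relationship the operator exploits visually: the database runway rectangle is expanded by the allowed touchdown-error margins into a containment contour, and the runway observed in the video (here displaced by a lateral database error) is checked against it. The margin values, the rectangular construction and the axis-aligned check are assumptions made for illustration; in the concept itself the check is performed visually by the operator.

# Illustrative construction of a runway containment contour and the
# corresponding containment check, in runway-aligned coordinates.

from typing import List, Tuple

Point2 = Tuple[float, float]  # along-runway (m past threshold), cross-runway (m)


def containment_contour(length_m: float, width_m: float,
                        long_margin_m: float, lat_margin_m: float) -> List[Point2]:
    """Corner points of the runway rectangle expanded by the allowed margins."""
    return [(-long_margin_m, -width_m / 2 - lat_margin_m),
            (-long_margin_m,  width_m / 2 + lat_margin_m),
            (length_m + long_margin_m,  width_m / 2 + lat_margin_m),
            (length_m + long_margin_m, -width_m / 2 - lat_margin_m)]


def runway_enveloped(runway_corners: List[Point2], contour: List[Point2]) -> bool:
    """Axis-aligned check: does the observed runway lie inside the contour?"""
    (x_min, y_min), (_, y_max), (x_max, _), _ = contour
    return all(x_min <= x <= x_max and y_min <= y <= y_max for x, y in runway_corners)


def observed_runway(lateral_error_m: float,
                    length_m: float = 2500.0, width_m: float = 45.0) -> List[Point2]:
    """Runway as seen in the video, displaced by a lateral database error."""
    half = width_m / 2
    return [(0.0, -half + lateral_error_m), (0.0, half + lateral_error_m),
            (length_m, half + lateral_error_m), (length_m, -half + lateral_error_m)]


contour = containment_contour(2500.0, 45.0, long_margin_m=300.0, lat_margin_m=20.0)
print(runway_enveloped(observed_runway(15.0), contour))  # True: error within margins
print(runway_enveloped(observed_runway(50.0), contour))  # False: touchdown would miss the TDZ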


Fig. 9. Integration of TDZ dimensions into a runway containment contour

Fig. 10. Advanced display concept: conformal integration of landing criteria

Fig. 11. Conventional display format

Figure 10 presents the implementation of the runway containment contour as a refinement of the initial design. The resulting advanced display concept features a conformal integration of the guidance reference and landing criteria with the sensor image. The concept primarily employs symbology with the same dynamic behavior as the world-based reference points contained in the video and does not depend on specific markers on the runway surface.

4. EVALUATION AND RESULTS

4.1 Evaluation

To compare the performance of the advanced display concept (Figure 10) with a conventional display format (Figure 11), and to investigate the impact of the level of operator involvement (LOA 4 vs. LOA 5), experiments were conducted in which participants monitored the progress of a simulated automatic UAV landing. The primary task of the participants was to assess the integrity of the guidance reference data used by the autoland function during the approach. In some of the approaches, deliberate position database errors were introduced, some of which caused the UAV to be commanded to a position outside of the allowed TDZ. In case the operator believed the UAV would land outside of the TDZ, a go-around had to be initiated; otherwise, the landing had to be continued. The feasibility of the failure identification and isolation concept was explored by introducing gyro system failures as illustrated in Figure 6. In such events, the operator had to identify and select the correct lane.


The experiments were conducted on the UAV Control Station (UCS) research simulator at the Royal Netherlands Naval College (RNLNC) and on a mobile UCS concept demonstrator at the Air Operations Control Station Nieuw Milligen (AOCS NM). Both set-ups contained a head-level display and a touch-screen multi-function display. The head-level display provided the participants with simulated monochrome video imagery from the UAV’s fixed, nose-mounted camera, along with a guidance and status symbology overlay [10]. The simulated imagery was created from a realistic model representing the Mojave Desert region. The touch-screen display below the head-level display provided the simplified control panel. Figures 12 and 13 depict both UCS simulators.

The experiment involved a total of 52 participants, including naval helicopter pilots, a commercial airline pilot, a general aviation pilot, Command and Control (C2) personnel, Military Air Traffic Control (MilATC) personnel and non-operational personnel.

Fig. 12. Pilot position of the UCS research simulator at the RNLNC

Fig. 13. UCS concept demonstrator at the AOCS Nieuw Milligen

4.2 Results

The results show that the explicit depiction of the available TDZ margins reduces the variability in the decision whether to continue or abort the automatic landing [11]. For the scenarios tested, performance data show that the advanced display concept provides a reduction of the miss rate by a factor of six to seven without an increase in false alarm rate.

Differences for the level of operator involvement were less pronounced. However, comments from the subjects indicate that the intended type of operator should not be neglected when deciding on the LOA assigned to the autoland function. Although in this particular experiment the conformance monitoring task was not directly evaluated, several participants indicated that they appreciated the increase in predictability provided by the depiction of the future flight path in the advanced display concept. This was especially the case for anticipating the location of the transition to the glide path, which with the conventional display more or less came as a surprise.

In about 95% of the events in which a gyro failure occurred, the operator was able to successfully identify and select the correct lane.


5. SUMMARY AND CONCLUSIONS

An advanced display concept for UAV autoland integrity and conformance monitoring is presented. In this concept, the system goals and landing criteria are explicitly anchored into the outside world using a conformally integrated depiction of the guidance reference data and the available margins (Figure 14). In addition, the level of operator involvement is addressed.

The advanced display concept provides a reduction of the miss rate by a factor of six to seven without an increase in false alarm rate. Since the concept can be implemented as an overlay on the sensor image and relies only on simple transformations, it can be realized on any existing platform capable of drawing 2-D vectors.

Fig. 14. Example of the advanced overlay to assess runway database integrity. The location of the runway containment contour is computed from the location of the runway in the database that is used for the guidance computation. Consequently, with the conformal integration, the goals are visually anchored in the sensor image. The human operator can detect an integrity error in the guidance data by checking whether the runway in the sensor image lies inside or outside the box representing the allowed margins. The billboard with DH (decision height) provides the operator with a preview of the location before which the runway integrity must have been assessed.


ACKNOWLEDGEMENTS

The authors would like to express their gratitude to the staff and participants of AOCS NM (Royal Netherlands Air Force), squadrons VGSQ 7 and VGSQ 860 of Naval Airbase De Kooy (Royal Netherlands Navy) and VLM Airlines for contributing to this project. Special thanks go out to Maj. Bas Lippe and Capt. Guus Henkens of the Knowledge and Innovation Centre of AOCS NM for their great efforts in supporting our research program.

REFERENCES

1. J. Charnley, “Navigation Aids to Aircraft All Weather Landing”, Journal of the Royal Institute of Navigation, Vol. 42, No. 2, pp. 161-186 (1989).

2. Office of the Secretary of Defense, Unmanned Aerial Vehicle Reliability Study, p. 56, 2003.

3. J. Rasmussen, Information Processing and Human Machine Interaction: An Approach to Cognitive Engineering, Elsevier, New York, 1986.

4. E. Theunissen, Integrated Design of a man-machine interface for 4-D navigation, Delft University Press, Delft, 1997.

5. D. Regal and D. Whittington, “Guidance symbology for curved flight paths”, Proceedings of the 8th International Symposium on Aviation Psychology, 74-79 (1995).

6. J. Tadema, E. Theunissen and G.J.M. Koeners, “Using perspective guidance overlay to improve UAV manual control performance”, Proceedings of SPIE, Vol. 6559, 65590C (2007).

7. E. Theunissen, F.D. Roefs, G.J.M. Koeners, R.M. Rademaker and T.J. Etherington, “Integration of Imaging Sensor Data Into a Synthetic Vision Display”, Proceedings of the 23rd Digital Avionics Systems Conference, (2004).

8. T.B. Sheridan, Humans and Automation: System Design and Research Issues, John Wiley & Sons Inc., 2002.

9. C.E. Billings, Aviation Automation: The Search for a Human-Centered Approach, Erlbaum, Mahwah, New Jersey, 1997.

10. G.J.M. Koeners, E. Theunissen, and J. Tadema, “Creating a Simulation Based Evaluation Environment for RPV Manual Control Concepts”, Proceedings of AIAA MST Conference and Exhibit, (2006).

11. J. Tadema and E. Theunissen, “A display concept for UAV autoland monitoring: rationale, design and evaluation”, Proceedings of the 26th Digital Avionics Systems Conference, 5.B.4 (2007).
