Queensland University of Technology
Reactive Image-based Collision Avoidance System for Unmanned Aircraft Systems

Shane Degen, B.Eng. (Hons) QUT

Australian Research Centre for Aerospace Automation
Faculty of Built Environment & Engineering

This thesis is prepared in partial fulfilment of the requirements for a Master's degree.

May 2011
I would like to dedicate the following thesis to Amanda, Lachlan, Elijah, Noah, Gabriella and Moses.
Keywords

Collision Avoidance; Unmanned Aircraft Systems; Unmanned Aerial Vehicles; Uninhabited Aerial Systems; UAS; UAV; Image-based Visual Servoing; Sense and Avoid; See and Avoid; Sense and Act; Obstacle Avoidance; Collision Risk; Guidance; Control; Gimballed Camera; Nonlinear Aircraft Control; Control and Simulation; MATLAB; Monte Carlo Simulation; Equivalent Level of Safety; ELOS; National Airspace System; Feature Based Manoeuvring; Position-based Avoidance; Intruder Alert; Bearings-Only Tracking.
Abstract

Approximately 20 years have passed since the NTSB issued its original recommendation to expedite the development, certification and production of low-cost proximity warning and conflict detection systems for general aviation [1]. While some systems are in place (TCAS [2]), "see-and-avoid" remains the primary means of separation between light aircraft sharing the national airspace. The requirement for a collision avoidance, or sense-and-avoid, capability onboard unmanned aircraft has been identified by leading government, industry and regulatory bodies as one of the most significant challenges facing the routine operation of unmanned aircraft systems (UAS) in the national airspace system (NAS) [3, 4].

In this thesis, we propose and develop a novel image-based collision avoidance system to detect and avoid an upcoming conflict scenario (with an intruder) without first estimating or filtering range. The proposed collision avoidance system (CAS) uses relative bearing and angular area subtended, both estimated from an image, to form a test statistic, CAS. This test statistic is used in a thresholding technique to decide whether a conflict scenario is imminent. If deemed necessary, the system commands the aircraft to perform a manoeuvre based on, and constrained by, the CAS sensor field of view.

Using a simulation environment in which the UAS is mathematically modelled and a flight controller is developed, we show through Monte Carlo simulations that the probability of a Mid Air Collision (MAC), RRMAC, or of a Near Mid Air Collision (NMAC), RiskRatio, can be estimated. We also show the performance gain this system has over a simplified (bearings-only) version; this gain is demonstrated in the form of a standard operating characteristic curve. Finally, it is shown that the proposed CAS performs at a level comparable to current manned aviation's equivalent level of safety (ELOS) expectations for Class E airspace. In some cases, the CAS may be oversensitive, manoeuvring the owncraft when not necessary, but this constitutes a more conservative, and therefore safer, flying procedure in most instances.
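The detect-then-manoeuvre logic summarised above can be sketched in a few lines. This is an illustrative stand-in only: the exact form of the CAS test statistic is developed in Chapter 3, and the ratio of bearing rate to image-area expansion rate used here, together with the ±16 threshold band, are assumptions made for the sketch.

```python
def manoeuvre_required(bearing_rate, area_rate, threshold=16.0):
    """Illustrative CAS-style thresholding decision.

    bearing_rate: rate of change of the intruder's relative bearing (deg/s)
    area_rate:    rate of change of the angular area it subtends (deg^2/s)

    A nearly constant bearing combined with a growing image area is the
    classic signature of a collision course, so the statistic below falls
    inside the +/-threshold band and triggers an avoidance manoeuvre.
    """
    if area_rate <= 0.0:
        return False  # intruder's image is not expanding: no closing threat
    cas = bearing_rate / area_rate  # stand-in for the thesis's test statistic
    return -threshold < cas < threshold
```

For example, a slowly drifting bearing (0.2°/s) with a clearly expanding image (0.1°²/s) yields a statistic of 2, well inside the ±16 band, so a manoeuvre would be commanded.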
Table of Contents

KEYWORDS  4
ABSTRACT  5
TABLE OF CONTENTS  7
TABLE OF FIGURES  11
ACRONYMS  15
NOMENCLATURE  19
STATEMENT OF AUTHORSHIP  27
ACKNOWLEDGEMENTS  29
1 INTRODUCTION  31
1.1 Motivation  31
1.2 Collision Avoidance Problem  32
1.2.1 Definitions in the collision avoidance problem  32
1.2.2 Definitions  34
1.2.3 Types of sensors  35
1.3 Research Objectives  36
1.4 Significance  37
1.5 Research Contributions  38
1.6 Research Methodology  38
1.7 Publications  39
1.8 Content  40
2 BACKGROUND  41
2.1 Collision Avoidance Systems: Industry Developments  41
2.1.1 NASA ERAST  41
2.1.2 Auto ACAS  42
2.1.3 Northrop Grumman & AFRL  42
2.1.4 Discussion  44
2.2 Collision Avoidance Strategies  44
2.2.1 Optimal Strategies  44
2.2.2 Force Control Techniques  45
2.2.3 Geometric Approaches  47
2.2.4 Vision-based Obstacle Avoidance  48
2.3 Passive-only Collision Avoidance  49
2.3.1 Passive Ranging  51
2.4 Discussion  52
3 COLLISION AVOIDANCE  53
3.1 Collision Determination  53
3.1.1 Thresholding Technique for Collision Decision  54
3.1.2 Closest Point of Approach Distance  54
3.1.3 Time to Collision and Image Expansion  56
3.1.4 Collision Determination Algorithm  57
3.2 Avoidance Manoeuvre  59
3.2.1 Background  59
3.2.2 Relative-Bearing Based Manoeuvre  61
4 MODELLING & SIMULATION  63
4.1 UAS Model  63
4.1.1 Owncraft Coefficients  64
4.1.2 Atmospheric Model  65
4.1.3 Navigation Equations  66
4.2 UAS Controller  69
4.3 Camera Model  72
4.3.1 Configuration  72
4.3.2 Perspective Projection Model  73
4.4 Simulation Environment  77
4.4.1 The Vision System Emulator  77
4.4.2 The Conflict Scenario Emulator  77
4.4.3 The UAV Emulator  78
4.4.4 Simulator Adaptability  79
5 RESULTS AND ANALYSIS  81
5.1 Performance Analysis  81
5.1.1 Encounter Models  81
5.1.2 Performance Measures  82
5.2 Experiment Setup  88
5.2.1 Monte Carlo Simulations  88
5.2.2 Limitations and Assumptions  90
5.3 Results and Analysis  91
5.3.1 CAS Threshold Determination  91
5.3.2 Observations and Behavioural Patterns  92
5.3.3 Probabilistic Results  97
5.3.4 Performance Results  100
6 CONCLUSION  105
7 FUTURE RECOMMENDATIONS  107
8 APPENDICES  109
APPENDIX A: Data and aerodynamic coefficients of the Flamingo UAS [125]  109
  Flamingo Limits  109
  Inertial Data  110
  Lift/Drag Data  110
  Longitudinal Coefficients  110
  Lateral Coefficients  111
  Mach Coefficients  111
  Control Coefficients  112
APPENDIX B: Flamingo Open-Loop Stability  113
  Lateral Stability  113
  Longitudinal Stability  116
APPENDIX C: Image Area Expansion  119
9 BIBLIOGRAPHY  122
Table of Figures

Figure 1 – Aviation layers of safety [24, 25]  33
Figure 2 – The sections of autonomous See-and-Avoid  36
Figure 3 – Research developmental path  39
Figure 4 – Collision cone geometry [80]  47
Figure 5 – Collision avoidance system decision process  53
Figure 6 – Geometry of a conflict scenario evolving over time  54
Figure 7 – Miss distance relationships  55
Figure 8 – Image plane characteristics  56
Figure 9 – Geometry of a conflict scenario  57
Figure 10 – Collision avoidance right-of-way sectors  60
Figure 11 – Typical encounter scenarios  61
Figure 12 – Owncraft model used to define linear and angular variables  63
Figure 13 – Aileron from heading and roll  69
Figure 14 – Rudder feed forward from sideslip (for coordinated turns)  69
Figure 15 – Throttle for airspeed hold  70
Figure 16 – Elevator for altitude hold  70
Figure 17 – Two-camera perspective projection setup  73
Figure 18 – Image of intruder as seen, without compensation, in the camera frame (top) and with motion compensation (bottom) as calculated. The units are wrt the focal length in millimetres.  76
Figure 19 – IBCASE (simulator) architecture  78
Figure 20 – Example of a standard operating characteristics curve [111]  83
Figure 21 – Possible outcomes for UAS with collision avoidance system [115]  84
Figure 22 – Random selection of intruder tracks encroaching owncraft  90
Figure 23 – Distribution of min(CAS) for experiment 1  92
Figure 24 – Example of scenario where an owncraft reaches new heading and immediately returns to the original heading. This is an example of a single manoeuvre.  94
Figure 25 – CAS behaviour for first 60 s of Figure 24 track. The CAS test statistic (red line) is between thresholds (±16), therefore a manoeuvre is made (3 s). The CAS is maintained at the last stable reading (blue dotted line) during the manoeuvre. At the new heading, it is deemed safe to return to the original heading (20 s), where the stable CAS is held (blue dotted line) until on the original heading.  94
Figure 26 – Example of scenario where owncraft maintains new heading until θ seconds before returning to original heading. This is an example of a maintained manoeuvre.  95
Figure 27 – CAS behaviour for first 60 s of Figure 26 track. The CAS test statistic (red line) is between thresholds (±16), therefore a manoeuvre is made (3 s). The CAS is maintained at the last stable reading (blue dotted line) during the manoeuvre. At the new heading, it is still not safe to return to the original heading (20 s), so the current heading is maintained for θ time until another CAS reading decides it is safe to return to the original heading (36 s).  95
Figure 28 – Example of scenario where owncraft avoids and returns to original heading; however, the CAS threshold is violated a second time. This is an example of a repeated manoeuvre.  96
Figure 29 – CAS behaviour for first 70 s of Figure 28 track. An avoidance manoeuvre is made at 3 s and then the CAS decision returns the owncraft to the original heading (19 s). When the owncraft has returned to the original heading, a second manoeuvre is performed (36 s) and the owncraft returns again (50 s).  96
Figure 30 – False Positive distributions before and after CAS is implemented  97
Figure 31 – A selection of Correct Avoidances made using the implemented algorithm. (a) top left – left intruder approach with maintained manoeuvre; (b) top right – right intruder approach with single manoeuvre; (c) middle left – left intruder approach with single manoeuvre; (d) middle right – right intruder approach with single manoeuvre; (e) & (f) bottom – right intruder approach with repeated manoeuvre.  98
Figure 32 – Another selection of Correct Avoidances made using the implemented algorithm. (a) top left – left intruder approach with maintained manoeuvre; (b) top right – right intruder approach with single manoeuvre; (c) & (d) middle – right intruder approach with single manoeuvre; (e) & (f) bottom – right intruder approach with repeated manoeuvre.  99
Figure 33 – Failed avoidance detection or manoeuvres according to TABLE V and Figure 21. (a) top right – Missed Detection; (b) top left – Late Alert; (c) bottom left – Late Alert; (d) bottom right – Late Alert on a repeated manoeuvre.  100
Figure 34 – Standard Operating Characteristics (SOC) curve for CAS  101
Figure 35 – SOC curve that displays original PCD and PFM  102
Figure 36 – Risk Ratio results for CAS  103
Acronyms

ACAS – Airborne Collision Avoidance System (European term)
AFRL – Air Force Research Laboratory (USA)
ARCAA – Australian Research Centre for Aerospace Automation
ASTM – American Society for Testing and Materials
ATC – Air Traffic Control
BOT – Bearings-Only Tracking
CA – Correct Avoidance
CAS – Collision Avoidance System
CASA – Civil Aviation Safety Authority
CD – Correct Detection
CNA – Conflict with No Action
CPA – Closest Point of Approach
CoG – Centre of Gravity
DRA – Defense Research Associates
EKF – Extended Kalman Filter
ELOS – Equivalent Level of Safety
ERAST – Environmental Research Aircraft and Sensor Technology
EO – Electro-optical
FA – False Alarm
FAA – Federal Aviation Administration (USA)
FMV – Försvarets Materielverk (Swedish Defence Materiel Admin.)
FOV – Field of View
FM – False Manoeuvre
HALE – High Altitude/Long Endurance UAV
IBCASE – Image Based Collision Avoidance Simulation Environment
ICAO – International Civil Aviation Organization
IC – Induced Conflict
IEEE – Institute of Electrical and Electronics Engineers
KTAS – Knots True Air Speed
LA – Late Alert
LOS – Line of Sight
MAC – Mid Air Collision
MATLAB – Matrix Laboratory (computer program)
MD – Missed Detection
MILP – Mixed Integer Linear Programming
MPC – Model Predictive Control
NAS – National Airspace System
NASA – National Aeronautics and Space Administration
NATO – North Atlantic Treaty Organization
NED – North-East-Down (coordinate frame)
NMAC – Near Mid Air Collision
NMI – Nautical Miles
NTSB – National Transportation Safety Board (USA)
PaRCA – Passive Ranging Collision Avoidance
PID – Proportional-Integral-Derivative controller
PN – Proportional Navigation
POI – Point of Impact
POMDP – Partially Observable Markov Decision Processes
PR – Proper Rejection
QUT – Queensland University of Technology
RR – Risk Ratio
RTCA – Radio Technical Commission for Aeronautics
SA – Successful Alert
TCAS – Traffic Collision Avoidance System
UA – Unnecessary Alert
UAS – Unmanned Aircraft System
UAV – Unmanned Aerial Vehicle
Nomenclature

ᴵAexp – Frontal cross-sectional area of intruder (m²)
b – Owncraft wing span (m)
CAS – Test statistic for collision avoidance
CD – Total drag coefficient
CL – Total lift coefficient
Cl – Rolling moment coefficient
Cm – Pitching moment coefficient
Cn – Yawing moment coefficient
CX – X body-axis coefficient
CY – Y body-axis coefficient
CZ – Z body-axis coefficient
c1–c9 – Moment equation coefficients listed in Equation (4.13)
c – Owncraft mean wing chord (m)
Del – Derivative gain for altitude from elevator outer control loop
d – Intruder size (a priori) (m)
EMAC – Expected number of MACs (MACs/hr)
ENMAC – Expected number of NMACs (NMACs/hr)
f – Focal length of cameras (m)
fov – Field of view of a single camera (°)
g – Gravity constant (m/s²)
h – Altitude: Z earth-axis position of the owncraft, NED (m)
hC – Commanded altitude (m)
he – Altitude error (m)
heng – Engine angular momentum about the x-axis (N·m·s)
IX – X body-axis moment of inertia (kg·m²)
IZX – Z–X body-axis product of inertia (kg·m²)
IY – Y body-axis moment of inertia (kg·m²)
IZ – Z body-axis moment of inertia (kg·m²)
Iel – Integral gain for altitude from elevator outer control loop
Ith – Integral gain on throttle from speed hold loop
kk – Induced drag non-dimensional coefficient
k – Time instance
L – X body-axis aerodynamic moment component (N·m)
Lapse – Lapse rate of temperature with height (K/m)
M – Y body-axis aerodynamic moment component (N·m)
Mair – Molar mass of air (kg/mol)
m – Owncraft mass (kg)
N – Z body-axis aerodynamic moment component (N·m)
n – Camera number, where n ∈ [1, 2]
ᴵn – Integer number
P – Air pressure (Pa)
P0 – Air pressure at sea level (Pa)
Pψ – Proportional gain for heading hold outer control loop
Pail – Proportional gain for aileron from roll control loop
Pel – Proportional gain for altitude from elevator outer control loop
Pth – Proportional gain on throttle from speed hold loop
Pff rud – Proportional feed-forward gain for coordinated turn of rudder from sideslip
PAlert – Probability of an alert being issued (hr⁻¹)
PCon – Probability of entering a conflict scenario (hr⁻¹)
PCD – Probability of a Correct Detection (hr⁻¹)
PCNA – Probability of a Conflict occurring if No Action (manoeuvre) is taken (hr⁻¹)
PFatality – Probability of a fatality occurring (hr⁻¹)
PFM – Probability of a False Manoeuvre
PMAC – Probability of a MAC occurring (hr⁻¹)
PMACwithCAS – Probability that a MAC occurs whilst a CAS is being used (hr⁻¹)
PMACwoCAS – Probability that a MAC occurs where no CAS is being used (hr⁻¹)
PNMAC – Probability of a NMAC occurring (hr⁻¹)
PNMACwithCAS – Probability that a NMAC occurs whilst a CAS is being used (hr⁻¹)
PNMACwoCAS – Probability that a NMAC occurs where no CAS is being used (hr⁻¹)
PSA – Probability of a Satisfactory Alert (hr⁻¹)
PUA – Probability of an Unnecessary Alert (hr⁻¹)
ᴵPimn – Intruder pixel location in the image of camera n, where n ∈ [1, 2]
ᴵPc – Intruder pixel location in the joined camera plane
ᴵPb – Intruder pixel location in the body axis of the owncraft
ᴵPN – Intruder pixel location in the earth axis wrt owncraft
ᴵPcomp – Motion-compensated pixel location of intruder wrt owncraft
p – X body-axis angular velocity component (rad/s)
q – Y body-axis angular velocity component (rad/s)
q̄ – Dynamic pressure (Pa)
R – Range from owncraft to intruder (m)
Rg – Ideal gas constant (J/mol·K)
Rk – Range from owncraft to intruder at time k (m)
RiskRatio – Probability that a NMAC will occur with a CAS
RRIC – Induced Conflict component of Risk Ratio
RRMAC – Probability that a MAC will occur with a CAS
RRunresolved – Unresolved risk component of Risk Ratio
r – Z body-axis angular velocity component (rad/s)
S – Owncraft wing area (m²)
Sk – Distance from owncraft to Point of Impact (POI) at time k (m)
ᴵs – Distance intruder travels in time ᴵt (m)
T – Engine thrust (N)
T0 – Temperature at sea level (K)
Tk – Perpendicular distance from intruder to POI at time k (m)
TTC – Time to collision (s)
ᵇTc – Camera position in owncraft body axis (m)
ᴵt – Time it takes intruder to travel distance ᴵs (s)
u – X body-axis velocity of owncraft (m/s)
un – X image-axis pixel location of intruder in camera n
Vol – Volume of airspace in encounter scenario (m³)
Vt – Velocity of the owncraft through the air (m/s)
VtC – Commanded velocity (m/s)
Vte – Velocity error (m/s)
v – Y body-axis velocity of owncraft (m/s)
w – Z body-axis velocity of owncraft (m/s)
X – X body-axis aerodynamic force component (N)
xb – X body-axis position (m)
xc – X camera-axis pixel location of the intruder
xe – X earth-axis position of the owncraft, NED (m)
ximn – X image-axis pixel location of intruder in camera n
xcomp – X motion-compensated-axis pixel location of intruder
xN – X earth-axis position of intruder wrt owncraft (m)
Y – Y body-axis aerodynamic force component (N)
Ym – Closest point of approach distance of two aircraft (m)
Ylower – Lower bound of random Ym generated (m)
Yupper – Upper bound of random Ym generated (m)
yb – Y body-axis position (m)
yc – Y camera-axis pixel location of the intruder
ye – Y earth-axis position of the owncraft, NED (m)
yimn – Y image-axis pixel location of intruder in camera n
ycomp – Y motion-compensated-axis pixel location of intruder
yN – Y earth-axis position of intruder wrt owncraft (m)
Z – Z body-axis aerodynamic force component (N)
zb – Z body-axis position (m)
zN – Z earth-axis position (altitude) of intruder wrt owncraft (m)
α – Angle of attack (rad)
β – Angle of sideslip (rad)
δail – Aileron control surface deflection (−1 ≤ δail ≤ 1)
δel – Elevator control surface deflection (−1 ≤ δel ≤ 1)
δrud – Rudder control surface deflection (−1 ≤ δrud ≤ 1)
δth – Throttle deflection (0 ≤ δth ≤ 1)
(·) – X body-axis position of camera (m)
(·) – Y body-axis position of camera (m)
φ – Roll: Euler angle of owncraft (rad)
φC – Roll command (°)
φe – Roll error (rad)
(·) – Z body-axis position of camera (m)
(·) – Relative bearing of intruder wrt owncraft (°)
(·)k – Relative bearing of intruder at time k (°)
(·) – Relative bearing rate of intruder wrt owncraft (°/k)
(·) – Relative elevation of intruder wrt owncraft (°)
λspiral – Spiral mode characteristic root
λroll – Roll mode characteristic root
(·) – Angle subtended by intruder in owncraft's image sensor (°)
(·)k – Angle subtended by the intruder at time k (°)
(·) – Angle subtended rate (image expansion, 1D) (°/k)
ᴵ(·)e – Intruder's position in the earth axes (m)
ᴵ(·)N – Intruder's position in the earth axes wrt owncraft, NED (m)
(·)e – Owncraft's position in the earth axes, (xe, ye, h) (m)
ᴼ(·)e – Owncraft's attitude, (φ, θ, ψ) (rad)
ψ – Heading (yaw): Euler angle of owncraft (rad)
ψCk – Heading command at time k (°)
ψe – Heading error (rad)
ψ̇max – Maximum heading rate change (°/s)
ρ – Air density (kg/m³)
(·) – Angle of heading alteration during manoeuvre (°)
τroll – Roll time constant (s)
(·) – Time for the heading alteration to take place (s)
θ – Pitch: Euler angle of owncraft (rad)
(·) – Half the fov of the camera (°)
(·)bc – Rotation of the camera wrt the body axis (rad)
(·)be – Rotation of the body axis wrt the earth axis, aka (·)e (rad)
ωnDR – Dutch Roll undamped natural frequency (rad/s)
ωnP – Phugoid undamped natural frequency (rad/s)
ωnSP – Short-period undamped natural frequency (rad/s)
(·)x – Rotation about the x-axis (rad)
(·)y – Rotation about the y-axis (rad)
(·)z – Rotation about the z-axis (rad)
(·) – Angular area subtended by the intruder (2D) (°²)
(·) – Angular area subtended rate (image area expansion, 2D) (°²/k)
ζDR – Dutch Roll damping ratio
ζP – Phugoid damping ratio
ζSP – Short Period damping ratio
Statement of Authorship
The work contained in this thesis has not been previously submitted to meet requirements for an award at this or any other higher education institution. To the best of my knowledge and belief, the thesis contains no material previously published or written by another person except where due reference is made.
Signature: ______________________________________________________
Date: __________________________________________________________
Acknowledgements

This research was supported and funded by the Australian Research Council, an Australian Postgraduate Award, the Faculty of Built Environment & Engineering, the Queensland University of Technology (QUT) Vice-Chancellor and the Queensland Government's Smart Skies Project.

I would like to express thanks to my primary supervisor, Dr Luis Mejias-Alvarez, for his guidance, and to Dr Jason Ford for his support as associate supervisor. Thanks also to Prof. Rodney Walker for his motivation and patience. I would also like to thank my fellow researchers within the Australian Research Centre for Aerospace Automation for all their assistance, patience and friendship.

I would like to express my thanks to my wife Amanda and our children Lachlan, Elijah, Noah and Gabriella for their awesome support throughout this period, without which this very well may not have been possible. Most of all, I would like to thank the Lord Jesus Christ for the opportunity, ideas, help, answers and inspiration for the following work.
1 Introduction

In this section, we introduce the motivation that drives this research; present the problem definition; describe the overall objective and goals of this thesis; discuss its significance; detail the novel contributions made; describe the adopted research methodology; list the publications; and finally, detail the thesis format.
1.1 Motivation

Recently, Australia celebrated a centenary of flight, which began with Harry Houdini's pioneering flight at Diggers Rest, Victoria, on March 16th, 1910 [5]. In the last 100 years, aviation has gone through considerable technological advancement. Currently, the aerospace industry is continuing the trend towards automation, replacing pilot functions with automated avionics. New systems are emerging that use an increased level of automation; these are called Unmanned Aircraft Systems (UAS).

UAS origins can be traced back to 1914, when Elmer Sperry demonstrated his gyro-stabilised Curtiss seaplane in a French aeroplane safety contest [6]. Over the last two decades, interest in pilotless aircraft operations has grown. The applications and scenarios for UAS utilisation are expanding as industry becomes more aware of the functionality and capability of these autonomous vehicles. Today, UAS are the fastest-growing sector in aerospace [7]. Annual sales are projected to grow over the next decade from $4.4 billion to $8.7 billion, with more than $62 billion to be spent in total [7].

For growth to continue on this scale, UAS operations need to expand beyond controlled airspace and operate freely within the national airspace system (NAS) [8]. NAS integration requires that UAS be capable of performing at an equivalent level of safety (ELOS) to that of manned aircraft [9-11]. One capability that manned aircraft have, and that UAS will require, is see-and-avoid [8]. See-and-avoid technology is also referred to as collision avoidance or sense-and-avoid.
Collision avoidance has been identified by various international organisations as one of the top-priority technological enablers for integrating UAS into the NAS. The United States Office of the Secretary of Defense has recognised it as the second most significant issue [3]. The Joint Air Power Competence Centre (NATO) has identified collision avoidance as one of the top 26 capabilities needed by UAS [12]. Regulatory bodies, including the Federal Aviation Administration (FAA) [8] and Eurocontrol [13], have also recognised the high priority of the collision avoidance issue. Various consortiums, committees, studies and reports have been created to address collision avoidance [14, 15]. Standard specifications have even been drawn up by ASTM (American Society for Testing and Materials) for the design and performance requirements of a sense-and-avoid system [16]. The RTCA (Radio Technical Commission for Aeronautics) also expects to have standards in place by 2011 [10].
1.2 Collision Avoidance Problem

1.2.1 Definitions in the collision avoidance problem

In the literature, several distinctly different research areas exist under the name of collision avoidance [17-19]. In this thesis, we use distinct definitions to break this problem into its various categories. The first distinction made is between UAS avoiding collisions with terrain or static obstacles, as opposed to other air traffic. In this thesis, obstacle avoidance is defined as avoiding collisions with terrain or static obstacles. Obstacle avoidance is not considered in this thesis, although a comparison is addressed later in the literature review. Two main categories of collision avoidance can be identified:
Cooperative collision avoidance is where two aircraft that are in communication with one another negotiate a mitigation strategy. This can happen in a distributed manner [20, 21] or with a central manager (separation management) [22, 23]. Non-cooperative collision avoidance, on the other hand, is where the onus is solely on each individual owncraft to find a way to avoid the conflict scenario. Non-cooperative collision avoidance is used as a safety backup in the event that separation management fails, or in case the aircraft are not in communication with one another [24, 25].

The various levels of collision avoidance are shown in Figure 1. Levels 1-4 are considered separation management or cooperative collision avoidance, and level 5 is non-cooperative collision avoidance [24, 26]. Non-cooperative collision avoidance is the type that governmental institutions and regulatory bodies have identified as the major technological enabler for UAS to operate freely in the NAS.
Level 5 – See & Avoid
Level 4 – TCAS/CDTI – ACAS
Level 3 – Radar Separation Services
Level 2 – Strategic Separation Services
Level 1 – Airspace Structures

Figure 1 – Aviation layers of safety [24, 25]
1.2.2 Definitions In this thesis, we make use of standard definitions from TCAS Minimum Operations Performance Standards [27], but slightly redefined to accommodate the context of our problem. The following definitions apply to our problem:
Near mid air collision (NMAC) occurs when two aircraft come within 500 feet horizontally, which is 152.4 metres (also 100 feet vertically but that component is ignored in this work) [27] .
Conflict scenario is defined for this research to be ‘an encounter scenario between two aircraft whereby the aircraft come within 152.4m of each other laterally. This would result in a NMAC being filed’.
Collision scenario is defined as ‘an encounter scenario whereby two aircraft will collide with one another if an avoidance manoeuvre is not made’.
Mid air collision (MAC) occurs when two aircraft collide with one another.
In the context of this research, we define:
A mid air collision (MAC) as an ‘encounter scenario that would lead to the two aircraft coming within 32m of one another’
This arises from realizing that a Boeing 747 (very large aircraft) has a wingspan of 60m, we assume that our vehicle has a wingspan of 4m (which is a typical midsize
UAS); thus 32m (the half-wingspans of the two aircraft added) seems a reasonable, even conservative, figure. There is typically no single distance that defines two aircraft coming into contact with one another: because aircraft have volume and are not point particles, this constant is an arbitrary approximation. This is a recurring problem in CAS performance evaluation [28].
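For concreteness, the two distance thresholds above can be sketched as a small classifier. This is an illustrative sketch only; the function and constant names are not from the thesis.

```python
# Illustrative encounter classifier using the thesis's two thresholds.
NMAC_RADIUS_M = 152.4                    # 500 ft lateral NMAC criterion
MAC_RADIUS_M = 60.0 / 2 + 4.0 / 2        # half-wingspans added: 30m + 2m = 32m

def classify_encounter(cpa_lateral_m: float) -> str:
    """Label an encounter by its lateral closest-point-of-approach distance."""
    if cpa_lateral_m < MAC_RADIUS_M:
        return "MAC"                     # treated as a mid air collision
    if cpa_lateral_m < NMAC_RADIUS_M:
        return "NMAC"                    # a near mid air collision would be filed
    return "clear"
```

A 10m miss is thus counted as a MAC, a 100m miss as an NMAC, and a 500m miss as clear.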
1.2.3 Types of sensors
Of particular importance to non-cooperative collision avoidance is the methodology used to acquire situational awareness of the environment in which the owncraft is operating. There are generally two types of sensors used for this purpose: active and passive. Active sensors emit radiation and wait for a reflected signal [29]. For the UAS application this radiation is mostly radar, but infrared, laser and ultrasonic emissions can also be used. Active sensors are generally heavier, larger, more expensive and more power-demanding, and have large bandwidth requirements [29, 30]. As such, active sensors are usually implemented on larger, more expensive UAS [31]. Passive sensors, on the other hand, acquire information from natural emissions and reflections [29]. Passive sensors are normally electro-optical (EO), both infrared and vision-based. In contrast to active sensors, they are typically much cheaper, smaller, lighter and more power-efficient, making them ideal for low-cost UAS [29]. They provide good bearing information [26] and are easily interfaced with processors for research-friendly computational analysis [32]. However, special attention must be paid to atmospheric effects, since they greatly affect the performance and quality of the data [29]. Another limitation of passive sensors is that no range information is directly observed [26].
1.3 Research Objectives
This research endeavours to use a passive sensor to address the non-cooperative collision avoidance problem. Thus the principal research objective driving this thesis is: to investigate the best method to declare and avoid a conflict scenario using image-based data.
Within this principal objective, the two major goals of this research are: first, to identify and use the most relevant image-based features to determine whether a conflict scenario will take place; and second, upon determining that a threat is likely, to use image-based features to manoeuvre the owncraft to avoid colliding with the intruder.
Figure 2 – The sections of autonomous See-and-Avoid (Intruder Detection → Collision Determination → Avoidance Manoeuvre)
Note that this research does not investigate the problem of detecting an intruder, but only that of determining whether a collision is likely given the intruder's behaviour in the image. Intruder detection is assumed to have taken place a priori and is outside the scope of this research, as illustrated in Figure 2. Intruder detection has been investigated and addressed in other research [33].
1.4 Significance
As previously discussed, the collision avoidance problem is one of the major hurdles to allowing UAS to operate freely in the NAS and thus to ensuring continued UAS market growth. Accordingly, various major players within industry have attempted to solve the collision avoidance problem (discussed in Section 2.1). The most capable system put forth from industry was proposed by Chen [31]. It uses a heavy, power-demanding sensor that costs approximately $200k; such a system seems unreasonable for low-cost UAS. A collision avoidance system using vision-only sensors would present a solution for the low-cost UAS market and be a major technological enabler for the entire UAS sector [29]. The collision avoidance algorithm presented in this research uses vision-only sensors and can be implemented on general-purpose hardware (costing < $5k). The significance this particular research has over other vision-only collision avoidance algorithms is that it:
- Triggers an avoidance manoeuvre earlier than range-estimate-dependent techniques such as those presented in [25, 30, 34]. This research is able to react within three camera frames (typically 0.12 seconds) of intruder detection, and does not require a range estimate in order to decide to manoeuvre, unlike [25, 34, 35].
- Uses a less comprehensive set of scenarios to obtain performance similar to Kochenderfer et al. [36], who implement partially observable Markov decision processes on bearings-only data for collision determination. However, this comparison should be made in the appropriate context, given the considerable number of scenarios addressed by Kochenderfer et al.
1.5 Research Contributions
The primary contribution of this thesis is the design of a collision avoidance system (CAS) that uses vision-only features to predict whether a collision is likely and to decide how to manoeuvre. The algorithm is novel in that it performs collision avoidance without using any position-based information; it operates on image-based information only. Because of this, it is able to react almost immediately, orders of magnitude faster than other systems [25, 30, 34] (see Section 3.1). As a derived contribution, this research proposes an avoidance manoeuvre algorithm based on a relative-bearing control law (see Section 3.2); this control approach is consistent with see-and-avoid recommendations. This research also contributes an EO sensor model that combines two standard EO sensors to achieve a wide field of view (see Section 4.3). This sensor is motion-compensated to account for platform manoeuvres, with the advantage of keeping the target in the sensor's field of view during platform manoeuvring. Finally, this thesis contributes a comprehensive set of validations based on Monte Carlo simulations. Using encounter models, this work provides performance metrics that are comparable with current aviation practices; performance is shown using standard operating characteristic curves.
1.6 Research Methodology
The research methodology followed in this thesis is illustrated in Figure 3. First, a literature survey in collision avoidance has been performed, the findings of which are presented in Section 2. Next, the problem is analysed and developed geometrically, as shown in Section 3. Then a collision avoidance simulation environment is developed using a model of the Flamingo UAS [37] and the collision avoidance sensor model. To validate the proposed collision determination and avoidance manoeuvre system, comprehensive Monte Carlo simulations with random
two-minute conflict scenarios are developed. The results are used to iteratively develop and refine the original algorithm. The final algorithm's results are detailed and analysed in Section 5.
Figure 3 – Research developmental path (Literature Survey → Mathematical Development → Simulation & Analysis)
1.7 Publications
Two publications were produced during this period of study. They are as follows:
Conference Paper S. C. Degen, L. Alvarez, J. J. Ford, and R. Walker, "Tensor field guidance for time-based waypoint arrival of UAVs by 4D trajectory generation," in Proceedings of the IEEE Aerospace Conference, Big Sky, Montana, 2009.
Submitted Journal Paper S. C. Degen and L. M. Alvarez, "A reactive image-based collision avoidance system for Unmanned aircraft systems," IEEE Transactions on Aerospace and Electronic Systems, 2011 (submitted).
1.8 Content
Thus far, we have discussed the motivation and rationale for the following research. In addition, we have described the problem and its significance. The remainder of this thesis is structured as follows. Section 2 investigates the background of collision avoidance in the literature, both from industry and academia, then discusses passive-only collision avoidance and the positioning of this thesis within the overall collision avoidance field. Section 3 investigates the characteristics of an image that are pertinent to conflict scenarios and then develops the IBCA technique that is used for detecting the collision; it goes on to show the adopted method for manoeuvring the owncraft around the intruder once a collision is detected. Section 4 presents the mathematical modelling of the UAS and its controller; it models the camera setup, which is the collision avoidance sensor, and the controller used in the simulation, and finally discusses the simulation environment developed for the testing phase. Section 5 discusses the performance metrics used to assess the safety of the proposed CAS, describes the setup of the experiment and its limitations, and then shows the results and discusses their implications. The conclusions of this thesis are detailed in Section 6, with recommendations for future work made in Section 7.
2 Background
This chapter surveys the existing literature in the collision avoidance domain. Firstly, the efforts of the aerospace industry are reviewed, as industry has been working on non-cooperative collision avoidance for decades. Next, the academic literature on passive obstacle avoidance and non-passive collision avoidance is surveyed. Finally, we investigate the research specifically in the field of passive-only collision avoidance.
2.1 Collision Avoidance Systems: Industry Developments
2.1.1 NASA ERAST
In March 2003, the National Aeronautics and Space Administration (NASA) Environmental Research Aircraft and Sensor Technology (ERAST) program flew twenty-two conflict scenarios using a 35 GHz radar sensor [38]. The concept of operation is for the collision detection system to provide situational awareness to a human-in-the-loop who is responsible for performing the actual manoeuvre. The program flew thirteen encounter scenarios with a single intruder aircraft and another nine scenarios with two intruders approaching at the same time. ERAST found that unassisted pilots would detect intruding aircraft at a distance of approximately 1-1.5 NMI (nautical miles). ERAST also discovered that any speed greater than 300 KTAS (knots true airspeed) with a detection distance of less than 4-5 NMI is difficult for the pilot to comfortably avoid. This is attributed to the human-in-the-loop; with an autonomous controller, the detection distance would have provided plenty of time for a manoeuvre. This aligns with the findings of Graham and Orr [39]. Overall, the collision detection system is deemed to
provide acceptable situational awareness to the pilots, except for a head-on conflict scenario where the relative velocity v_rel between the two approaching aircraft reached approximately 600 KTAS (using an F/A-18).
2.1.2 Auto ACAS
In 2002, Boeing, Lockheed Martin, the Air Force Research Laboratory (AFRL), Saab and FMV (the Swedish Defence Materiel Administration) developed Auto-ACAS (Airborne Collision Avoidance System) for military aircraft communicating over an established data link [40]. It is able to negotiate conflict scenarios and has been tested at relative velocities up to v_rel = 860 KTAS. The system predicts the owncraft's trajectory 5-10 seconds into the future and transmits it, comparing this prediction against the trajectory predictions transmitted by all other aircraft. An avoidance manoeuvre is carried out cooperatively should the risk exceed a threshold. Optimal control is used to roll the aircraft away from one another, attempting to maintain a closest point of approach (CPA) of 100m. Researchers found a violation of the CPA for one conflict scenario, where the CPA went as low as 80m [41].
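The trajectory-comparison step at the heart of such systems reduces, for straight-line motion, to a closest-point-of-approach computation. The following is a minimal planar sketch under a constant-velocity assumption (Auto-ACAS uses richer trajectory predictions; the function name is illustrative):

```python
import math

def closest_point_of_approach(p_own, v_own, p_int, v_int):
    """Time and distance of CPA for two constant-velocity 2-D trajectories."""
    rx, ry = p_int[0] - p_own[0], p_int[1] - p_own[1]   # relative position
    vx, vy = v_int[0] - v_own[0], v_int[1] - v_own[1]   # relative velocity
    v2 = vx * vx + vy * vy
    # Minimise |r + v t|^2 over t >= 0 (t = 0 if already diverging).
    t_cpa = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    d_cpa = math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)
    return t_cpa, d_cpa

# Head-on geometry, 200 m/s closing speed, 80 m lateral offset:
t, d = closest_point_of_approach((0, 0), (100, 0), (1000, 80), (-100, 0))
```

Here the CPA occurs 5 s ahead at an 80 m miss distance, which would violate a 100 m CPA requirement and trigger a manoeuvre.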
2.1.3 Northrop Grumman & AFRL
In 2005, Northrop Grumman developed the passive ranging concept [25]. This concept involves a manoeuvring owncraft acquiring a range estimate of the intruder using a triangulation technique; for Northrop Grumman, this manoeuvre involved a climb. Northrop Grumman's research uses passive sensors and gets a range estimate to converge to approximately 5-7% error within 7 seconds. This convergence occurs at approximately 120 ft of the 500 ft, 20-second climb. The climbing manoeuvre works well because the best motion for intruder range estimation is one perpendicular to the intruder line of sight (LOS) [42]. The AFRL and Defense Research Associates (DRA) Detect And Avoid program used an electro-optical (EO) sensor with approximately 0.5 milliradian (mRad) resolution [43]. In simulation, this system detects the intruder at approximately 4 NMI with near-100% confidence and a 0.05% false detection rate. However, during flight testing there
were thousands of false tracks [44], of which 68% were eliminated by tracking for around 20 frames. In 2005, AFRL and Northrop Grumman developed a see-and-avoid sensor suite under the SeFAR program [45]. Later, in 2007, Northrop Grumman and AFRL-DRA collaborated on a series of flight trials with the sensor suite [46]. A Lear jet was used as a surrogate High Altitude/Long Endurance (HALE) UAS for these trials, and the Northrop Grumman electro-optical sensor was combined with the traffic collision avoidance system (TCAS). Twenty-seven different conflict scenario geometries were flown with two (human-piloted) intruder aircraft. Results from the flight trial deemed the collision avoidance system (CAS) successful, while acknowledging there was still much work to do. Improvements are currently being made to reduce the range at which the intruder is detected and the number of false tracks. It was found that long-wave infrared cameras performed little, if any, better than electro-optical (EO) sensors [46]. The PaRCA (Passive Ranging Collision Avoidance) algorithm performed 'well' (CPA distance exceeded 762m) except for the head-on case, when the detection range was only 1.5 NMI. PaRCA is an evolutionary algorithm by Shakernia et al. [25] that pre-empts the avoidance manoeuvre before executing it: it calculates range during the execution of the avoidance manoeuvre, which is later incorporated in the controller, so that the owncraft does not have to perform a reversal if the passive-ranging manoeuvre increases the collision risk. PaRCA considers other constraints such as camera field of view (FOV), owncraft dynamic limits and air traffic control (ATC) corridors. In May 2008, the third evolution of the AFRL/Northrop Grumman programs flew [31]. Radar, Automatic Dependent Surveillance-Broadcast and TCAS were added to the EO sensor, and the suite became known as the Multiple Intruder Autonomous Avoidance sensor. The ICV AI-130 radar was used for this series of flight trials. This system flew many different geometries, with up to two intruders, and was deemed successful; pilot participants remarked that, "This is how a pilot would have done it." The fundamental problem with the AFRL approach for low-cost UAS is
the price, size, complexity, weight and power required for the sensor suite. The evolved sensor suite now costs nearly $200k.
2.1.4 Discussion
The Auto-ACAS system [40, 41] has a dedicated communications channel, making it a cooperative technology. The program showed almost no success but helped others to understand the difficulty of the problem. The NASA ERAST program [38, 39] involves radar, which is not a passive sensor; even with this expensive and very capable sensor, difficulties still arose in some scenarios. Nevertheless, the CAS was deemed a general success, just not in extreme situations. Both of these programs are worth noting because they work on the collision avoidance problem, but they differ fundamentally from the research of this thesis in that they are either cooperative or non-passive. The joint Northrop Grumman and AFRL program [25, 31, 42-46] started out using a passive sensor but has recently moved to a non-passive sensor suite due to the difficulties involved in using a passive-only sensor. Therefore, this program is not directly comparable to the work of this thesis.
2.2 Collision Avoidance Strategies
2.2.1 Optimal Strategies
Optimal control techniques for collision avoidance close the control loop and make use of dynamic-programming-type approaches for avoiding obstacles [47]. Receding Horizon Control [48] (also known as Model Predictive Control (MPC) [49]), which emerged from the field of chemical process control [50], has been adapted for UAS nonlinear control. At each time step, MPC closes the loop around an open-loop optimal control solution and incorporates the new environment variables, which in the case of collision avoidance means the intruder.
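The receding-horizon idea can be illustrated with a toy 1-D regulator (entirely illustrative, not any of the cited aircraft formulations): optimise over a short horizon, apply only the first control, re-measure the state, and repeat.

```python
from itertools import product

def mpc_step(x, v, horizon=3, dt=1.0, accels=(-1.0, 0.0, 1.0)):
    """One receding-horizon step for a 1-D double integrator driven to the
    origin: exhaustively search all control sequences over the horizon, but
    apply only the FIRST control of the best sequence (the loop is closed by
    re-measuring the state and re-planning at the next step)."""
    best_u, best_cost = 0.0, float("inf")
    for seq in product(accels, repeat=horizon):
        xi, vi, cost = x, v, 0.0
        for u in seq:
            vi += u * dt
            xi += vi * dt
            cost += xi * xi + 0.1 * u * u   # state penalty + control penalty
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Closed loop: re-plan every step from the newly measured state.
x, v = 5.0, 0.0
for _ in range(20):
    u = mpc_step(x, v)
    v += u * 1.0
    x += v * 1.0
```

In a collision avoidance setting, the cost term would additionally penalise predicted proximity to the intruder at each step of the horizon.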
The first method considered here uses Mixed Integer Linear Programming (MILP) with MPC strategies. This approach was first proposed by Pallottino et al. [51] and also by Richards and How [47] to avoid vehicle collisions cooperatively. The decentralized version of this MILP-with-MPC algorithm is still cooperative [47]. It also requires a priori intruder position and uses cooperative techniques to acquire this information, although one could just as easily have used active sensors for non-cooperative collision avoidance. Similarly, the nonlinear MPC method introduced by Shim and Kim [52] utilizes potential field methods (as discussed in Section 2.2.2) and uses a priori position information about the intruder, obtained from active sensors, to safely navigate around obstacles. MPC was first implemented using passive sensors by Frew [53]. Frew derives the equations relevant to establishing the MPC and puts the target information in a Fisher information matrix that models the probability of collision in an estimate covariance matrix. His approach uses live data to navigate through unknown environments with static obstacles; for the collision avoidance case, it uses a priori position and velocity information about the intruder. Thus, the dynamic collision avoidance problem remains unaddressed. In the next version of this approach, Frew [48] includes adaptive control in the MPC and builds upon this controller to form a global planner, but still does not address the dynamic obstacle problem. In later work, Frew [54] applies the Unscented Transform to bearings-only information, but only for stationary targets (obstacles).
2.2.2 Force Control Techniques The notion of vector fields is similar to the idea of potential field methods, which is the pioneering work of Khatib [55]. The use of potential functions has continued to be one of the mainstream approaches to robotic task execution in the presence of obstacles [56, 57]. A comprehensive summary of techniques that address the classic geometric problem of constructing a collision-free path and traditional path-planning algorithms is provided by Latombe [58]. Furthermore, progressive improvements
made to the general potential field methods over the last two decades [59-61] continue today [57]. Potential field methods are force control techniques that navigate a robot by sensing the environment and mapping it according to physical equations that make that environment analogous to physical laws [62]. In the specific case of potential field methods, the analogous physical law is the electrostatic charge equation, where vehicle and/or obstacles and goal are treated as charges of opposite sign. Then the path is mapped by calculating the Coulomb forces between every point of the environment and the vehicle. The vehicle „falls‟ down the path of least resistance. There are other force control techniques that come from mobile robot navigation using physical equations for navigation; these are gaseous diffusion [63], Laplace‟s equations [64] or mechanical stress fields [65]. These and many others have been developed for the obstacle and collision avoidance problem over the decades. Potential field methods more prevalent to UAS are the impedance force model developed by Jang et al. [18]. Also, vector fields have been developed for UAS [6674] and used in commercial systems [75]. Vector fields have also been developed by Degen et al. [76]. Sigurd and How [77] develop a method called total field collision avoidance, for multiple UAS guidance and avoidance in a dense vehicle and obstacle environment, once again using active sensors. The free flight algorithm is another adaptation of potential field methods [20, 78]. It was developed by the RTCA in 1995 [79]. Aircraft repel one another according to a „voltage potential function‟ in order to achieve a minimum closest point of approach distance (CPA). All these techniques require the relative position of the intruder to be known or calculated.
This can be done with vision-based sensors but implies extreme
complexity and is subject to calibration errors, otherwise it requires active sensors or to be done cooperatively.
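The attract/repel mechanics described above can be sketched in a few lines. This is a generic Khatib-style gradient step with made-up gains, not any of the cited formulations:

```python
import math

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=50.0,
                         influence=5.0, step=0.2):
    """One normalized gradient-descent step on an attractive/repulsive
    potential: the goal pulls, obstacles inside the influence radius push."""
    fx = k_att * (goal[0] - pos[0])            # attractive force toward goal
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:                   # repulsion only when nearby
            mag = k_rep * (1.0 / d - 1.0 / influence) / d ** 2
            fx += mag * dx / d
            fy += mag * dy / d
    n = math.hypot(fx, fy) or 1.0
    return pos[0] + step * fx / n, pos[1] + step * fy / n

# 'Fall' toward a goal at (10, 0) past an obstacle at (5, 0).
pos = (0.0, 0.1)
for _ in range(300):
    pos = potential_field_step(pos, goal=(10.0, 0.0), obstacles=[(5.0, 0.0)])
```

The small initial lateral offset breaks the symmetry, so the vehicle slides around the obstacle rather than stalling at a force-balance point, one of the well-known local-minimum pitfalls of these methods.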
2.2.3 Geometric Approaches
2.2.3.1 Collision cones
The collision cone approach was first presented by Chakravarthy and Ghose [80]. Based on the geometry of Figure 4, a collision is avoided if the aircraft meet one of the following conditions: 1) R > R_P; 2) ψ_rel ∉ [θ_0, θ_f] (the relative velocity heading lies outside the cone); or 3) ψ_rel ≤ −π/2 or ψ_rel ≥ π/2 (the aircraft are diverging).
Figure 4 – Collision cone geometry [80]
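The cone membership test can be sketched via the cone half-angle subtended by the intruder's protected circle. This is our own compact formulation for illustration, not the exact conditions of [80]:

```python
import math

def inside_collision_cone(p_rel, v_rel, r_protect):
    """Planar collision-cone test. p_rel: intruder position relative to the
    owncraft; v_rel: owncraft velocity relative to the intruder (on a
    collision course it points at the intruder); r_protect: protected-zone
    radius around the intruder. Returns True when the relative velocity
    heading lies inside the cone of bearings intersecting that circle."""
    rng = math.hypot(*p_rel)
    if rng <= r_protect:
        return True                              # already inside the zone
    theta = math.atan2(p_rel[1], p_rel[0])       # bearing to the intruder
    half = math.asin(r_protect / rng)            # cone half-angle
    psi = math.atan2(v_rel[1], v_rel[0])         # relative-velocity heading
    dpsi = math.atan2(math.sin(psi - theta), math.cos(psi - theta))
    return abs(dpsi) < half
```

A head-on relative velocity aimed straight at an intruder 1000m away falls inside the cone; the same velocity with the intruder displaced 500m laterally falls outside it.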
The 3D version of this algorithm is presented by Watanabe et al. [81]. It uses the active sensor (radar) planar implementation of Kumar and Ghose [82], but still does not address the fundamental issue of dynamic obstacles without range
information. Han and Bang [19] couple these with proportional navigation (PN) strategies. This algorithm has also been flown on the UAS of Carbone et al. [83].
2.2.3.2 Bearings-only tracking
Bearings-only tracking (BOT) is a technique that uses only relative-bearing information to filter estimates of the intruder's range. The range estimates allow avoidance manoeuvres to be calculated because the positions of the relative parties become known. Methods used for range estimation include extended Kalman filters (EKF) [84], posterior Cramér-Rao bounds [85] (which both Frew and How adopt in almost all of the previously mentioned MPC work), particle filters [86] and some methods for multi-target tracking [86, 87]. These have been adapted to modified polar coordinates with better results [88], and the particle filter initialization issues are addressed by Bréhard and Le Cadre [85]. It has been demonstrated that the particle filter implementation yields the best results for BOT filtering [84, 86, 89]. The BOT field also covers the passive ranging concept discussed below in Section 2.3.1.
2.2.4 Vision-based Obstacle Avoidance
The majority of the research in vision-based obstacle avoidance comes from the optical flow field. Green and Oh [17] investigate obstacle avoidance (as defined in Section 1.2.1) using optical flow, by mounting one-pixel, 1-D, lightweight (4.8g) optical flow sensors at ±45° on an indoor plane. The UAV is shown to avoid collisions with a basketball net using rudder deflection only. Another system, proposed by Fasano [26], couples an EO sensor, using purely optical-flow-based methods, with radar. This hybrid approach combines the higher positive hit rate, range information and all-weather performance of radar with the angular resolution of an EO sensor. Recchia et al. [90] look at an EO-only implementation, but show that this system has many inherent limitations. The limitation comes from a requirement for a
stationary observer (for a moving target) or a stationary target (for a moving observer). Another vision-based technique for UAS is proposed by Watanabe et al. [81]. This algorithm uses optimal techniques to filter the position estimate of a stationary obstacle, then pre-empts the next waypoint to obtain a more efficient flight path using minimum-effort guidance. Call et al. [91] investigate using an EO sensor to avoid stationary obstacles: a corner-detecting image-processing algorithm detects the obstacles, and a reactive guidance algorithm known as vector fields [71, 74] performs the avoidance manoeuvre. Griffiths et al. [72] use laser range finders (active sensors) to avoid stationary objects located straight ahead, and use optical flow sensors for navigating canyon corridors. Shelnutt [92] develops a method for negotiating obstacles using optical flow, similar to Griffiths' method for navigating canyon corridors. It navigates between two obstacles (one on either side) by nullifying the difference in LOS rates, which is essentially the same as equalizing the optic flow on either side. It is similar to the work presented in this thesis in that it uses features of the image (in this case the LOS rate) to navigate without first converting the image-based data into position-based information.
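The flow-balancing behaviour described for corridor navigation can be shown with a toy model (a made-up 1-D corridor where flow magnitude is simply the inverse of the distance to each wall; the names and gain are illustrative):

```python
def corridor_offset_after(steps, y=1.0, half_width=5.0, gain=2.0):
    """Centre in a corridor by nulling the left/right optic-flow difference.
    y is the lateral offset from the centreline (positive = nearer the
    left wall); flow is modelled as 1/distance to each wall."""
    for _ in range(steps):
        flow_left = 1.0 / (half_width - y)      # nearer wall, larger flow
        flow_right = 1.0 / (half_width + y)
        y += gain * (flow_right - flow_left)    # drift toward the weaker flow
    return y

final_offset = corridor_offset_after(100)
```

Starting 1m off-centre in a 10m corridor, the offset decays toward zero, which is the flow-equalised (centred) condition.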
2.3 Passive-only Collision Avoidance
Angelov et al. [93] propose a passive method to estimate the risk of a collision based on consecutive bearing measurements: small changes in bearing indicate an increased risk of collision (see Equation (3.1)). The paper does not address the decision aspect of the problem, i.e. what threshold of risk is deemed acceptable, nor does it address the avoidance manoeuvre or produce any performance results. The methodology presented by Angelov et al. is closely related to that presented in this thesis; however, their unpublished figures for avoidance performance make it impossible to assess their approach directly against the CAS presented here. Their performance
figures could, however, be used to assess the collision determination/decision level of Section 5.3.3. Kochenderfer et al. [36] use relative bearing and time to collision, TTC (extractable from the angular area subtended, μ), for hazard alerting. This is similar to the work presented in this thesis. They show that a bearings-only (κ-only) CAS using thresholding techniques is rather ineffective: such a CAS would produce as many false alerts as successful alerts (see Section 5.1.2). They therefore implement partially observable Markov decision processes (POMDP) to improve performance over a bearings-only CAS. The POMDP system dynamically updates the underlying state (impending collision) from measurements (LOS rate) using Bayes' rule. An observation model is obtained from simulation data; an impending-collision belief is updated from LOS measurements and compared against the model. The belief state is thresholded, whereupon a decision about manoeuvring is made. The POMDP system (which incorporates TTC) does obtain notably better performance than the κ-only system. The POMDP CAS employs a comprehensive encounter model for simulation and testing and therefore cannot be directly compared against the CAS of this thesis. However, the POMDP CAS is compared to the κ-only system to highlight the performance gain, and this thesis also employs the same strategy for displaying performance increase. There are two reasons our system cannot be directly compared against the POMDP. Firstly, it is not possible to replicate the POMDP work without the observation model. Secondly, POMDP is implemented at the collision determination level only (see Figure 2) and therefore does not manoeuvre, meaning that late alerts and induced conflicts (see Section 5.1.2) are not accounted for.
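The belief-update mechanism underlying the POMDP alerting can be sketched as a two-hypothesis Bayes-rule recursion. The observation likelihoods below are made up for illustration; Kochenderfer et al. learn theirs from simulation data:

```python
def bayes_update(belief, lik_collision, lik_clear):
    """One Bayes-rule update of P(impending collision): scale the prior by
    the measurement's likelihood under each hypothesis and renormalise."""
    num = belief * lik_collision
    return num / (num + (1.0 - belief) * lik_clear)

# A run of low bearing-rate measurements, assumed more likely under the
# "collision" hypothesis, drives the belief up toward an alert threshold.
belief = 0.05
for _ in range(7):
    belief = bayes_update(belief, lik_collision=0.8, lik_clear=0.3)
alert = belief > 0.95
```

Each consistent measurement multiplies the odds by the likelihood ratio (here 0.8/0.3), so evidence accumulates geometrically; the final thresholding of the belief is where the alert decision is made.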
2.3.1 Passive Ranging
An interesting geometric approach to passive-only collision avoidance is a technique henceforth known as passive ranging. This method manoeuvres the owncraft and filters converging range estimates via a form of triangulation [46]. The idea of manoeuvring the owncraft for range estimation derives from the bearings-only tracking (BOT) research field, e.g. Oshman and Davison [86] and Logothetis et al. [87]. It first appeared in UAS obstacle avoidance in Frew and Rock [94], who investigated related issues such as constraints on camera FOV and measurement uncertainty. Other related research investigates different approaches for optimizing the manoeuvre to increase awareness or reduce convergence time, e.g. the information-theoretic approach of Logothetis et al. [95]. Passive ranging itself first appeared in UAS obstacle avoidance in 2005 with Calise et al. [35], who use optimal manoeuvring to obtain converging range estimates on obstacles; in their paper, the camera is modelled similarly to the CAS sensor used in this thesis (see Section 4.1). Passive ranging was first used for UAS collision avoidance by Shakernia and Chen [25]. They achieved, in around 7 seconds, range error convergence of 5-10% at 120 ft of a 500 ft climb, exerting a 1.16 g manoeuvre for their particular encounter scenarios. They noted that this climb manoeuvre is efficient because a manoeuvre perpendicular to the line of sight is required for quick convergence, and the climb is close to optimal. They compared it against a 'dog-leg' lateral manoeuvre, which did not perform as well. Voos [30] proposed a method for image-based (passive-only) collision avoidance that does not require a manoeuvre. It filters range information with an EKF using time-to-collision information (see Section 3.1.3), obtained by monitoring the intruder's expansion (in pixels) in the image. This time-to-collision information takes around 3-4 seconds to converge.
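The geometric core of passive ranging is simple triangulation from a perpendicular displacement. The sketch below treats the target as quasi-stationary and noise-free; the real problem must handle intruder motion and measurement noise, hence the filtering in [25, 35]:

```python
import math

def range_from_bearing_shift(baseline, d_bearing):
    """Range to a quasi-stationary target from the bearing shift d_bearing
    induced by displacing the camera by `baseline` perpendicular to the
    line of sight (the idea behind the climb manoeuvre)."""
    return baseline / math.tan(d_bearing)

# A 150 m climb against a target 3000 m away shifts the bearing by ~2.86
# degrees; triangulation recovers the range from that shift.
true_range = 3000.0
climb = 150.0
d_bearing = math.atan2(climb, true_range)
estimate = range_from_bearing_shift(climb, d_bearing)
```

This also shows why manoeuvres perpendicular to the line of sight converge fastest: they maximise the bearing shift per metre flown, while motion along the line of sight produces no shift at all.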
2.4 Discussion
The work in this thesis is closest to that of Voos [30]. The literature discussed in Section 2.3 filtered range information either by manoeuvring the owncraft or by incorporating image expansion and relative-bearing rate. It could be argued that a system that manoeuvres the owncraft is not passive, as it requires action on the owncraft's behalf; the work presented in this thesis does not manoeuvre the owncraft unless it deems a collision imminent. Shakernia et al. [25] and Frew [34] manoeuvre the aircraft in order to obtain converging range estimates. Shakernia et al. [25], Frew [34] and Voos [30] all require time for these range estimates to converge, ranging from 4-15 seconds, which can be critical in a pending collision scenario. The work in this thesis is similar to Voos [30] in that it uses image area expansion and relative-bearing rate. However, instead of filtering a range estimate and then making a decision based on the estimated intruder position, it makes a decision directly from these image-based features, in a fashion similar to image-based visual servoing research, e.g. Chaumette and Hutchinson [96, 97]. Thus, the following research does not have to wait for a filter to converge on range and can act almost immediately (within three camera frames, approximately 0.12 seconds).
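The expansion-based timing cue that Voos filters can be seen directly in the image geometry: at constant closing speed the subtended area grows as the inverse square of range, so time to collision follows from the area and its growth rate alone, with no range estimate. A hedged numeric sketch (illustrative values, not the thesis's sensor model):

```python
def ttc_from_area(area, area_rate):
    """Time to collision from angular-area expansion: area ~ 1/range^2 at
    constant closing speed, so A_dot/A = 2/tau and tau = 2*A/A_dot."""
    return 2.0 * area / area_rate

# A 4 m target closing at 100 m/s from 1000 m; subtended area ~ (size/range)^2.
r, v, dt = 1000.0, 100.0, 0.1
a_now = (4.0 / r) ** 2
a_next = (4.0 / (r - v * dt)) ** 2           # one 0.1 s frame later
tau = ttc_from_area(a_now, (a_next - a_now) / dt)
```

The finite-difference estimate lands close to the true 10 s time to collision after a single frame pair, which is why an expansion cue can react so much faster than a range filter that needs seconds to converge.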
3 Collision Avoidance
This chapter presents the research that decides whether a collision is deemed imminent using image-based characteristics (Equation (3.15)). In addition, we show the equations and algorithm that decide the avoidance manoeuvre based on image-based characteristics.
3.1 Collision Determination
In this sub-section, we investigate the features in an image that directly affect a conflict scenario, namely, intruder relative-bearing rate and image expansion. The relative bearing rate (radians/second) of the intruder is measured with respect to a body-fixed coordinate frame on the owncraft. Image expansion (pixels/second) is the intruder's one-dimensional growth in the image. Image area expansion (pixels²/second) is the intruder's two-dimensional growth in the image, which can give greater resolution (as discussed in Section 3.1.3). We also develop the collision determination algorithm whereby the owncraft decides whether to avoid the possible upcoming collision. This addresses the first major research goal (see Section 1.3) and represents a large portion of the novel contribution.
[Flowchart: the CAS measures κ and μ each frame; if f(κ, μ) is below the threshold (YES), the intruder's sector, from κ, selects the avoidance manoeuvre; otherwise (NO) control remains with the normal controller.]
Figure 5 – Collision avoidance system decision process
Figure 5 shows that the relative-bearing rate and image expansion are used to decide whether the collision avoidance system (CAS) will manoeuvre. The CAS makes its decision about the avoidance manoeuvre based on the sector of the intruder (as discussed in Section 3.2.1).
3.1.1 Thresholding Technique for Collision Decision
To decide whether a collision is imminent, a thresholding technique is applied to the test statistic for collision avoidance $C_{AS}$, later shown in Equation (3.15). This system is depicted in Figure 5.
Figure 6 – Geometry of a conflict scenario evolving over time
3.1.2 Closest Point of Approach Distance
The first characteristic in an image that has relevance to the closest point of approach (CPA), or miss distance, is the angular velocity of the centroid of the intruder's image across the FOV in the owncraft's compensated camera frame. This is termed the relative-bearing rate of the intruder. When considering conflict scenario geometry, both Regan and Gray [98] and the Australian Transport Safety Bureau [99] state that, for dynamic targets, a collision becomes imminent when the
intruder stops moving in the inertial frame. This is shown in Figure 6, where $\kappa_{fov}$ is the camera field-of-view; a mid air collision (MAC) will occur if and only if:

$\dfrac{\partial \kappa}{\partial t} = 0, \qquad \kappa \in [-\kappa_{fov}, \kappa_{fov}]$  (3.1)
However, it is not just a MAC that needs to be avoided; an NMAC must be reported to the authorities, so an ideal CAS would avoid colliding and not come within 152.4 m of the intruder (the defined NMAC zone). Aviation standards [16] recommend for a CAS that the minimum separation (CPA) of the two aircraft always be more than 152.4 m.
Figure 7 – Miss distance relationships
Another characteristic that has a direct impact on the CPA distance is image expansion. From Regan and Gray [98], the miss distance (CPA) is defined as:

$m = I_n \, d \, \dfrac{\partial\kappa/\partial t}{\partial\mu/\partial t}$  (3.2)

Where $I_n$ is an integer and $d$ is the intruder size from Figure 7. Also, $\mu$ is defined as the angle subtended by the intruder, thus $\partial\mu/\partial t$ (or $\dot{\mu}$) is defined as the image expansion. It is evident from Equation (3.2) that as $\partial\kappa/\partial t$ approaches zero then so does the CPA
distance, making an NMAC more likely. Also, if $\partial\mu/\partial t \gg \partial\kappa/\partial t$ then an NMAC is likely to occur because the CPA distance would tend to zero. An important note is that the intruder size $d$ is not known a priori for a conflict scenario.
3.1.3 Time to Collision and Image Expansion
Hoyle [100] presented a description appropriate for collision geometry. Time to collision $TTC$, also known as time to pass (for the non-colliding case), is:

$TTC = -\dfrac{R}{\partial R/\partial t}$  (3.3)

Where $R$ is the range to the intruder. Using the small angle approximation $\tan\mu \approx \mu$, Regan and Gray [98] showed that:

$TTC = \dfrac{\mu}{\partial\mu/\partial t}$  (3.4)
Figure 8 – Image plane characteristics
The angle subtended by the intruder is a one-dimensional value. However, it is expected that more accuracy (higher detail) could be attained from the two-dimensional equivalent, termed the angular-area subtended. Thus a greater resolution
or estimation would be achieved for $TTC$ using $\sigma$. This value for $\sigma$ is easily measured in images, as they are planar (two-dimensional). If an image pixel is considered to have an approximately square relationship (from Figure 8), then the angular-area subtended is:

$\sigma = \mu^2$  (3.5)

It is shown (in APPENDIX C) that the relationship between $TTC$ and this angular-area subtended $\sigma$ is:

$TTC = \dfrac{2\sigma}{\partial\sigma/\partial t}$  (3.6)
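Equations (3.4) and (3.6) can be sketched numerically. The following Python fragment (the thesis used MATLAB; this is only an illustrative rendering) estimates TTC from per-frame finite differences of the subtended angle and of the angular-area subtended; the synthetic encounter numbers (wingspan, range, closing speed, frame period) are assumptions for the demonstration, not values from the thesis.

```python
def ttc_from_angle(mu, mu_prev, dt):
    """TTC from the 1-D subtended angle, Equation (3.4): TTC = mu / (dmu/dt)."""
    dmu_dt = (mu - mu_prev) / dt
    return mu / dmu_dt

def ttc_from_area(sigma, sigma_prev, dt):
    """TTC from the angular-area subtended, Equation (3.6): TTC = 2*sigma / (dsigma/dt)."""
    dsigma_dt = (sigma - sigma_prev) / dt
    return 2.0 * sigma / dsigma_dt

# Synthetic head-on closure: range closes at 50 m/s from 1000 m,
# intruder span d = 10 m, so mu ~ d/R and sigma ~ mu**2 (Equation (3.5)).
d, R0, closing, dt = 10.0, 1000.0, 50.0, 0.04
mu = [d / (R0 - closing * t * dt) for t in range(2)]
sigma = [m ** 2 for m in mu]

# Both estimates should agree with the true TTC of roughly 20 seconds.
print(ttc_from_angle(mu[1], mu[0], dt))
print(ttc_from_area(sigma[1], sigma[0], dt))
```

Both estimators recover the same TTC because, by Equation (3.5), the area grows at twice the relative rate of the subtended angle, which is exactly what the factor of two in Equation (3.6) compensates for.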
Figure 9 – Geometry of a conflict scenario
3.1.4 Collision Determination Algorithm
Most collision avoidance approaches so far have used the range $R_k$ in the control law to ensure that the miss distance is always larger than 152.4 m or some other threshold. In this research, we propose the use of passive-only sensors; therefore $R_k$ is not directly observed. Instead, we determine a dimensionless test statistic from conflict scenario image characteristics (time to collision $TTC$ and relative bearing rate $\dot{\kappa}_k$) to ensure that $R_k \geq 152.4$ m.
From Figure 9 it is known that for a range $R$, the corresponding bearing $\kappa$ is:

$\tan\kappa = \dfrac{T}{S}$  (3.7)

Using the small angle approximation, which holds true for incremental time steps,

$\kappa \approx \dfrac{T}{S}$  (3.8)

Inside this small angle approximation $T$ is constant (as reflected in Figure 9), and

$S \approx R$  (3.9)

Thus,

$\kappa \propto \dfrac{1}{R}$  (3.10)

Taking the time derivative of (3.10),

$\dfrac{\partial\kappa}{\partial t} \propto -\dfrac{1}{R^2}\dfrac{\partial R}{\partial t}$  (3.11)

From (3.3), and utilising that $\partial R/\partial t$ is constant or approaching a constant for conflict scenarios,

$TTC \propto R$  (3.12)

Equations (3.1) and (3.2) showed that the risk of collision increases (the CPA distance decreases) as $\partial\kappa/\partial t$ approaches zero. It is therefore intrinsic that the risk of collision also increases as $TTC$ approaches zero, particularly if $\partial\kappa/\partial t \to 0$. For a thresholding technique in accordance with Figure 5, we can find a dimensionless test statistic $C_{AS}$ by relating (3.11) and (3.12):

$C_{AS} = f\!\left(\dfrac{\partial\kappa}{\partial t},\, TTC\right)$  (3.13)

$C_{AS} \propto \dfrac{1}{R^2}\dfrac{\partial R}{\partial t} \cdot TTC^2$  (3.14)

$C_{AS} = \dfrac{\partial\kappa}{\partial t}\left(\dfrac{2\sigma}{\partial\sigma/\partial t}\right)^2$  (3.15)

Where the inputs of (3.15) are directly determined from the image using (3.16) and (3.17), i.e. $\partial\sigma/\partial t$ and $\partial\kappa/\partial t$, respectively.
$\dot{\sigma}_k = \sigma_k - \sigma_{k-1}$  (3.16)

$\dot{\kappa}_k = \kappa_k - \kappa_{k-1}$  (3.17)

Where $\kappa_k$ is derived later in Equation (4.32).
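The decision quantities above can be rendered as a short Python sketch (the thesis implements this in MATLAB). It combines the per-frame differences of (3.16)-(3.17) with the TTC of Equation (3.6) to form the Equation (3.15) statistic; the specific numbers in the two example calls are invented measurements, chosen only so that one case resembles a collision course and the other a crossing track.

```python
def cas_statistic(kappa_k, kappa_prev, sigma_k, sigma_prev):
    """Test statistic of Equation (3.15): C_AS = (dkappa/dt) * TTC^2,
    with per-frame differences (3.16)-(3.17) and TTC from Equation (3.6)."""
    dkappa = kappa_k - kappa_prev          # (3.17), per frame
    dsigma = sigma_k - sigma_prev          # (3.16), per frame
    if dsigma <= 0.0:                      # image not expanding: no closure
        return float('inf')
    ttc = 2.0 * sigma_k / dsigma           # frames to collision, (3.6)
    return abs(dkappa) * ttc ** 2

# Collision course: bearing nearly static, image area growing fast -> small C_AS.
on_course = cas_statistic(0.1000, 0.0999, 2.2e-4, 2.0e-4)
# Crossing traffic: bearing sweeping, little expansion -> large C_AS.
crossing = cas_statistic(0.120, 0.100, 2.02e-4, 2.0e-4)
print(on_course, crossing)
```

A small value of the statistic flags the conflict, which is why the decision block of Figure 5 tests whether the statistic has dropped below a threshold.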
3.2 Avoidance Manoeuvre In this sub-section, we look at what the owncraft should do when it has decided that a collision is imminent. Using image-based features, what is the best possible avoidance manoeuvre? This addresses the second major research goal (see Section 1.3).
3.2.1 Background
In the literature and in aviation practice, the avoidance manoeuvre is typically determined by the intruder's position, i.e. the relative bearing decides the manoeuvre direction [30, 101]. This is driven by aviation regulations about which aircraft has right-of-way for given scenarios, which are all position-based (ICAO [102], FAA [103], CASA [104]). These aviation rules have exceptions, depending on whether the intruder is unpowered or comparatively unresponsive. Because the image-based collision avoidance system proposed in this research is unable to discriminate the responsiveness of the intruder, the owncraft gives way in all encounter scenarios. It is important to mention that aviation specifications [16] state that a CAS should have a horizontal field of view of ±110˚ and a vertical field of view of ±15˚, as reflected in Figure 10. Intruders in the rearward sector are considered to be overtaking and thus must give way. Authors have taken the above specifications and safety regulations and proposed a method for avoidance based on sectors. Voos [30] and Sislak et al. [101] use this sector-based technique for their avoidance manoeuvre. Note from Figure 10 that Sector 1 has no particular size. In addition, Figure 10 illustrates our particular CAS sensors, which have a field of view (FOV) of 60˚. Although the image expansion of Section 3.1.3 is measured from this camera, it is proposed to use a third gimballed camera with a narrow FOV (e.g. 10˚ × 8˚) to get more accuracy in the TTC information of Equation (3.6). Voos [30] and Sislak et al. [101] separate the sectors according to:
Sector 1 – Oncoming Intruder – Each aircraft should alter course to the right.
Sector 2 – Rearward Intruder – The intruder must give way.
Sector 3 – Starboard Intruder (right-hand) – The owncraft must give way.
Sector 4 – Portward Intruder (left-hand) – The intruder must give way.
[Diagram: Sectors 1-4 around the owncraft, with the ±110˚ specification FOV and the 60˚ CAS sensor FOV marked.]
Figure 10 – Collision avoidance right-of-way sectors
These sectors do not account for the previously mentioned exceptions to the general give-way policy. In this thesis, the intruder detection system does not identify the type of intruder, so all intruders are treated as comparatively unresponsive. Thus, regardless of the intruder's sector or responsiveness, the onus is on the owncraft to manoeuvre.
3.2.2 Relative-Bearing Based Manoeuvre
The method implemented for manoeuvring in this thesis aligns with aviation rules and is similar to the previously discussed work of Voos [30] and Sislak et al. [101]. The avoidance manoeuvre is made based upon the relative bearing of the intruder. However, as Sector 1 of Figure 10 is half the FOV of our vision sensor, it can introduce some problems to the developed method, increasing collision risk unnecessarily. Thus, in this work Sector 1 is divided between Sector 3 and Sector 4 respectively. The owncraft will turn to the right by $\tau$˚ if the intruder is on the right, or it will turn to the left by $\tau$˚ if the intruder is on the left, in accordance with:

$\psi^C_k = \psi^C_{k-1} + \mathrm{signum}(\kappa)\,\tau$  (3.18)

Figure 11 – Typical encounter scenarios (a)-(d)

Where $\psi^C_k$ is the commanded heading at time $k$. For the encounter scenarios in Figure 11, Equation (3.18) would cause a right-hand turn because $\kappa$ is positive (the intruder is on the right). The results using Equation (3.18) are shown and discussed in Section 5.3. TABLE I reflects the algorithm in which this avoidance manoeuvre is implemented.
TABLE I
ALGORITHM FOR AVOIDANCE MANOEUVRE

Turn time = τ / maximum turn rate (16 seconds for the Flamingo)
IF intruder not passed
    IF C_AS < threshold AND not in a turn (setturn == 0)
        setturn = 1
        Turn for the turn time
    ENDIF
    IF C_AS < (threshold + buffer) AND in a turn (setturn == 1)
        Maintain the turn for extra time
    ENDIF
ENDIF
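TABLE I and Equation (3.18) can be combined into one stateful decision step. The Python sketch below is illustrative only: the threshold, buffer, turn angle and frame period are placeholder values, not the tuned figures used in the thesis, and the class name is invented.

```python
import math

def signum(x):
    return (x > 0) - (x < 0)

class AvoidanceManoeuvre:
    """Sketch of the TABLE I algorithm with the Equation (3.18) heading command.
    All numeric defaults are illustrative placeholders."""
    def __init__(self, threshold=0.05, buffer=0.02, tau_deg=20.0, turn_time=16.0):
        self.threshold, self.buffer = threshold, buffer
        self.tau = math.radians(tau_deg)   # turn amount tau, Equation (3.18)
        self.turn_time = turn_time         # ~16 s for the Flamingo
        self.setturn, self.timer = 0, 0.0

    def step(self, c_as, kappa, psi_cmd, dt, intruder_passed=False):
        if not intruder_passed:
            if c_as < self.threshold and self.setturn == 0:
                self.setturn, self.timer = 1, self.turn_time
                psi_cmd = psi_cmd + signum(kappa) * self.tau    # (3.18)
            elif c_as < self.threshold + self.buffer and self.setturn == 1:
                self.timer += dt            # maintain the turn for extra time
        if self.setturn == 1:
            self.timer -= dt
            if self.timer <= 0.0:
                self.setturn = 0            # turn complete
        return psi_cmd

cas = AvoidanceManoeuvre()
cmd = cas.step(c_as=0.01, kappa=0.2, psi_cmd=0.0, dt=0.04)
print(cmd)   # right turn commanded, since the intruder is to the right (kappa > 0)
```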
4 Modelling & Simulation In this chapter, we present the mathematical development behind the UAS model, control and the CAS sensor configuration. We also describe the architecture of the simulation environment that is developed to validate the proposed CAS.
4.1 UAS Model
We model the UAS used in the image-based collision-avoidance simulation environment (IBCASE) to verify the collision determination and avoidance manoeuvre developed in this thesis. We develop the equations of motion represented by Equations (4.11) - (4.15).
Figure 12 – Owncraft model used to define linear and angular variables (body axes $x_b, y_b, z_b$; forces $X, Y, Z$; moments $L, M, N$; velocities $u, v, w$; rates $p, q, r$)
Our dynamic model is based on a nonlinear 6-dof rigid-body dynamic model [105]. We have used the aerodynamic coefficients for a Silvertone Flamingo UAS (owncraft) [37], shown in Figure 12. Our simulation environment emulates this model, given that this platform represents the experimental test-bed for future flight
trials. However, it could easily be adapted for any UAS, given the coefficients and other relevant data.

TABLE II
NOMENCLATURE

Quantity                                   Symbol                                                 Units
Velocity, angle-of-attack, sideslip        $V_t, \alpha, \beta$                                   m/s, rad, rad
Aerodynamic force components (body-axes)   $X, Y, Z$                                              N
Aerodynamic moments (body-axes)            $L, M, N$                                              N·m
Translational velocities (body-axes)       $u, v, w$                                              m/s
Angular velocities (body-axes)             $p, q, r$                                              rad/s
Euler angles (roll, pitch, yaw)            $\phi, \theta, \psi$                                   rad
Position (earth-axes, NED)                 $x_e, y_e, h$                                          m
Aileron, elevator, rudder, throttle        $\delta_{ail}, \delta_{el}, \delta_{rud}, \delta_{th}$ rad, rad, rad, -
Engine thrust, owncraft mass               $T, m$                                                 N, kg
Wing area, chord, span                     $S, \bar{c}, b$                                        m², m, m
Dynamic pressure, gravity                  $\bar{q}, g$                                           Pa, m/s²
4.1.1 Owncraft Coefficients
The particular aerodynamic coefficients and other relevant data are presented in APPENDIX A. TABLE II shows the nomenclature used in the following sections. The equations of motion use the body-axes coefficients of Equation (4.2) instead of the wind-axes coefficients $(C_D, C_Y, C_L)$. These coefficients vary dynamically with respect to the state inputs $(V_t, \alpha, \beta, p, q, r)$ and the control surface deflections $(\delta_{ail}, \delta_{el}, \delta_{rud}, \delta_{th})$:
$C_L = C_{L_0} + C_{L_\alpha}\alpha + C_{L_q}\dfrac{\bar{c}}{2V_t}q$
$C_D = C_{D_0} + k\,C_L^2$  (4.1)

$C_X = C_L\sin\alpha - C_D\cos\alpha$
$C_Y = C_{Y_\beta}\beta + C_{Y_{\delta_{rud}}}\delta_{rud} + C_{Y_r}\dfrac{b}{2V_t}r$
$C_Z = -C_L\cos\alpha - C_D\sin\alpha$  (4.2)

$C_l = C_{l_\beta}\beta + C_{l_{\delta_{rud}}}\delta_{rud} + C_{l_{\delta_{ail}}}\delta_{ail} + \dfrac{b}{2V_t}\left(C_{l_p}p + C_{l_r}r\right)$
$C_m = C_{m_0} + C_{m_\alpha}\alpha + C_{m_{\delta_{el}}}\delta_{el} + \dfrac{\bar{c}}{2V_t}\left(C_{m_q}q + C_{m_{\dot{\alpha}}}\dot{\alpha}\right)$
$C_n = C_{n_\beta}\beta + C_{n_{\delta_{rud}}}\delta_{rud} + C_{n_{\delta_{ail}}}\delta_{ail} + \dfrac{b}{2V_t}\left(C_{n_p}p + C_{n_r}r\right)$  (4.3)
4.1.2 Atmospheric Model
The dynamic pressure $\bar{q}$ (Pa) is obtained from the pressure $P$ (Pa) at altitude $h$ (m):

$P = P_0\left(1 - \dfrac{L_{apse}\,h}{T_0}\right)^{\frac{g\,M_{air}}{R_g\,L_{apse}}}$  (4.4)

TABLE III
ATMOSPHERIC MODEL VARIABLES

Quantity                   Symbol       Value       Units
Gravity                    $g$          9.80665     m/s²
Air pressure @ 0 m (STP)   $P_0$        101 325     Pa
Temperature @ 0 m (STP)    $T_0$        288.16      K
Ideal gas constant         $R_g$        8.31447     J/(mol·K)
Molar mass of air          $M_{air}$    0.0289644   kg/mol
Lapse rate                 $L_{apse}$   0.0065      K/m
Then the density $\rho$ (kg/m³) at that altitude is:

$\rho = \dfrac{P\,M_{air}}{R_g\,(T_0 - L_{apse}\,h)}$  (4.5)

$\bar{q} = \dfrac{1}{2}\rho V_t^2$  (4.6)

Thus the dynamic pressure is:

$\bar{q} = \dfrac{1}{2}V_t^2\,\dfrac{M_{air}\,P_0\left(1 - \frac{L_{apse}\,h}{T_0}\right)^{\frac{g\,M_{air}}{R_g\,L_{apse}}}}{R_g\,(T_0 - L_{apse}\,h)}$  (4.7)

TABLE III shows the particular variables used to calculate Equation (4.7).
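Equations (4.4)-(4.7) with the TABLE III constants can be sketched directly (the thesis used MATLAB; this Python rendering is for illustration only):

```python
# Troposphere model of Equations (4.4)-(4.7), using the TABLE III constants.
G, P0, T0, RG, M_AIR, LAPSE = 9.80665, 101325.0, 288.16, 8.31447, 0.0289644, 0.0065

def pressure(h):
    """Equation (4.4): static pressure (Pa) at altitude h (m)."""
    return P0 * (1.0 - LAPSE * h / T0) ** (G * M_AIR / (RG * LAPSE))

def density(h):
    """Equation (4.5): ideal-gas density (kg/m^3) at altitude h."""
    return pressure(h) * M_AIR / (RG * (T0 - LAPSE * h))

def dynamic_pressure(vt, h):
    """Equation (4.6): q_bar = 0.5 * rho * Vt^2."""
    return 0.5 * density(h) * vt ** 2

print(density(0.0))                  # sea-level density, ~1.225 kg/m^3
print(dynamic_pressure(30.0, 1000.0))
```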
4.1.3 Navigation Equations
The owncraft navigation equations are defined in the flat-earth north-east-down (NED) axes [105, 106]. These equations assume a stationary centre of gravity (CoG) with constant mass and a uniform gravitational field. They also ignore the angular momentum of the engine, i.e. $h_{eng} = 0$.

$\dot{u} = rv - qw - g\sin\theta + \dfrac{\bar{q}SC_X + T}{m}$
$\dot{v} = pw - ru + g\cos\theta\sin\phi + \dfrac{\bar{q}SC_Y}{m}$
$\dot{w} = qu - pv + g\cos\theta\cos\phi + \dfrac{\bar{q}SC_Z}{m}$  (4.8)
$u = V_t\cos\alpha\cos\beta$
$v = V_t\sin\beta$
$w = V_t\sin\alpha\cos\beta$  (4.9)

$V_t = \sqrt{u^2 + v^2 + w^2}, \qquad \alpha = \tan^{-1}\!\left(\dfrac{w}{u}\right), \qquad \beta = \sin^{-1}\!\left(\dfrac{v}{V_t}\right)$  (4.10)
$\dot{V}_t = \dfrac{u\dot{u} + v\dot{v} + w\dot{w}}{V_t}, \qquad \dot{\alpha} = \dfrac{u\dot{w} - w\dot{u}}{u^2 + w^2}, \qquad \dot{\beta} = \dfrac{\dot{v}\,V_t - v\,\dot{V}_t}{V_t^2\sqrt{1 - (v/V_t)^2}}$  (4.11)

The time derivatives of the translational states $V_t, \alpha, \beta$ in Equation (4.11) are calculated from Equations (4.8) and (4.9) using the previous time-instance values from Equation (4.10), starting from a trimmed condition, i.e. $V_{t_0} = u_0$ with $p_0 = q_0 = r_0 = v_0 = w_0 = 0$. The moment Equations (4.12) and kinematic Equations (4.14) are:

$\dot{p} = (c_1 r + c_2 p + c_4 h_{eng})q + \bar{q}Sb(c_3 C_l + c_4 C_n)$
$\dot{q} = (c_5 p - c_7 h_{eng})r - c_6(p^2 - r^2) + \bar{q}S\bar{c}\,c_7 C_m$
$\dot{r} = (c_8 p - c_2 r + c_9 h_{eng})q + \bar{q}Sb(c_4 C_l + c_9 C_n)$  (4.12)
Where the coefficients used by Equation (4.12) are, with $\Delta = I_X I_Z - I_{ZX}^2$:

$c_1 = \dfrac{(I_Y - I_Z)I_Z - I_{ZX}^2}{\Delta}, \quad c_2 = \dfrac{(I_X - I_Y + I_Z)I_{ZX}}{\Delta}, \quad c_3 = \dfrac{I_Z}{\Delta}, \quad c_4 = \dfrac{I_{ZX}}{\Delta}, \quad c_5 = \dfrac{I_Z - I_X}{I_Y},$
$c_6 = \dfrac{I_{ZX}}{I_Y}, \quad c_7 = \dfrac{1}{I_Y}, \quad c_8 = \dfrac{I_X(I_X - I_Y) + I_{ZX}^2}{\Delta}, \quad c_9 = \dfrac{I_X}{\Delta}$  (4.13)
$\dot{\phi} = p + \tan\theta\,(q\sin\phi + r\cos\phi)$
$\dot{\theta} = q\cos\phi - r\sin\phi$
$\dot{\psi} = \dfrac{q\sin\phi + r\cos\phi}{\cos\theta}$  (4.14)
Finally, the owncraft navigation equations are:

$\dot{x}_e = u\,C_\theta C_\psi + v(C_\psi S_\theta S_\phi - S_\psi C_\phi) + w(C_\psi S_\theta C_\phi + S_\psi S_\phi)$
$\dot{y}_e = u\,S_\psi C_\theta + v(S_\psi S_\theta S_\phi + C_\psi C_\phi) + w(S_\psi S_\theta C_\phi - C_\psi S_\phi)$
$\dot{h}_e = u\,S_\theta - v\,C_\theta S_\phi - w\,C_\theta C_\phi$  (4.15)

Where $C_x$ and $S_x$ denote $\cos x$ and $\sin x$ respectively. The owncraft's position $^O\Pi_e = (x_e, y_e, h)$ and attitude $\Theta_e = (\phi, \theta, \psi)$ in the earth-axes are calculated by Euler integration involving Equations (4.11) - (4.15). Please refer to TABLE IV on coordinate reference frames for the appropriate meaning of the suffix subscripts for the images, intruder position and owncraft position.

TABLE IV
REFERENCE COORDINATE FRAMES

Symbol    Coordinate frame
$im_n$    Image frame of the nth camera (n = [1, 2])
$c$       Common camera frame (relative, as if one camera)
$b$       Body-fixed coordinate frame
$e$       Earth-fixed coordinate frame, on earth (NED)
$N$       Earth-fixed coordinate frame on the UAS
$comp$    Attitude-compensated coordinate frame
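The kinematic and navigation parts of this propagation, Equations (4.14) and (4.15), can be sketched in a few lines. This Python fragment (the thesis used MATLAB) Euler-integrates only the position states for a fixed attitude and body velocity; the flight condition used in the demonstration is an assumption, not a case from the thesis.

```python
import math

def euler_kinematics(phi, theta, psi, p, q, r):
    """Equation (4.14): Euler-angle rates from body rates."""
    phi_dot = p + math.tan(theta) * (q * math.sin(phi) + r * math.cos(phi))
    theta_dot = q * math.cos(phi) - r * math.sin(phi)
    psi_dot = (q * math.sin(phi) + r * math.cos(phi)) / math.cos(theta)
    return phi_dot, theta_dot, psi_dot

def ned_rates(u, v, w, phi, theta, psi):
    """Equation (4.15): flat-earth NED navigation rates (h positive up)."""
    c, s = math.cos, math.sin
    xe_dot = (u * c(theta) * c(psi)
              + v * (c(psi) * s(theta) * s(phi) - s(psi) * c(phi))
              + w * (c(psi) * s(theta) * c(phi) + s(psi) * s(phi)))
    ye_dot = (u * s(psi) * c(theta)
              + v * (s(psi) * s(theta) * s(phi) + c(psi) * c(phi))
              + w * (s(psi) * s(theta) * c(phi) - c(psi) * s(phi)))
    h_dot = u * s(theta) - v * c(theta) * s(phi) - w * c(theta) * c(phi)
    return xe_dot, ye_dot, h_dot

# Euler integration at 25 Hz, as in IBCASE: straight and level at 30 m/s,
# heading 090 (east), should move the owncraft east only for one second.
dt, state = 1.0 / 25.0, [0.0, 0.0, 100.0]      # xe, ye, h
for _ in range(25):
    rates = ned_rates(30.0, 0.0, 0.0, 0.0, 0.0, math.pi / 2)
    state = [x + dx * dt for x, dx in zip(state, rates)]
print(state)
```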
4.2 UAS Controller
The control approach for the UAS is divided into inner and outer loops, where the superscript $e$ denotes an error signal, $C$ denotes a commanded signal, and $P_{xyz}, I_{xyz}, D_{xyz}$ and $Pff_{xyz}$ are the PID gains on control surface $xyz$, shown in TABLE XIII of APPENDIX A. The inner loops that stabilise the owncraft are:
Aileron from roll (Figure 13). This controller determines how much to deflect the aileron $\delta_{ail}$ by applying a proportional gain $P_{ail}$ to the error signal between the measured bank angle $\phi$ and the commanded bank angle $\phi^C$:

$\delta_{ail} = P_{ail}\,\phi^e = P_{ail}(P_\psi\,\psi^e - \phi)$  (4.16)

Figure 13 – Aileron from heading and roll
Rudder from sideslip (Figure 14). This controller allows for coordinated turns. It applies a deflection directly to the rudder $\delta_{rud}$, determined by applying a feed-forward proportional gain $Pff_{rud}$ to the calculated angle of sideslip $\beta$:

$\delta_{rud} = Pff_{rud}\,\beta$  (4.17)

Figure 14 – Rudder feed forward from sideslip (for coordinated turns)
Throttle for speed hold (Figure 15). This controller determines the throttle deflection $\delta_{th}$ by applying a proportional-integral gain $(P_{th}, I_{th})$ to the error signal between the commanded velocity $V_t^C$ and the measured velocity $V_t$:

$\delta_{th} = P_{th}\,V_{t_k}^e + I_{th}\sum_{n=0}^{k} V_{t_n}^e$  (4.18)

Figure 15 – Throttle for airspeed hold
The outer loops are for guidance and navigation purposes. These are:
Altitude hold using elevator (Figure 16). This controller determines an elevator deflection $\delta_{el}$ by applying a proportional-integral-derivative (PID) gain $(P_{el}, I_{el}, D_{el})$ to the error signal between the commanded altitude $h^C$ and the measured altitude $h$:

$\delta_{el} = P_{el}\,h_k^e + I_{el}\sum_{n=0}^{k} h_n^e + D_{el}(h_k^e - h_{k-1}^e)$  (4.19)

Figure 16 – Elevator for altitude hold
Heading hold using roll (Figure 13). This part of the controller determines the commanded bank angle $\phi^C$ by applying a proportional gain $P_\psi$ to the error signal between the actual heading $\psi$ and the commanded heading $\psi^C$:

$\phi^C = P_\psi(\psi^C - \psi)$  (4.20)
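The loop structure of Equations (4.16), (4.19) and (4.20) can be illustrated in a few lines of Python (the thesis implements the controller in MATLAB). The gains here are arbitrary placeholders, not the tuned Flamingo gains of TABLE XIII, and the class and function names are invented for the sketch.

```python
class AltitudeHold:
    """Discrete PID elevator loop in the shape of Equation (4.19);
    gains are illustrative placeholders."""
    def __init__(self, p=0.02, i=0.001, d=0.05):
        self.p, self.i, self.d = p, i, d
        self.acc, self.prev = 0.0, None

    def elevator(self, h_cmd, h):
        e = h_cmd - h
        self.acc += e                                  # running sum for the I term
        de = 0.0 if self.prev is None else e - self.prev
        self.prev = e
        return self.p * e + self.i * self.acc + self.d * de

def bank_cmd(psi_cmd, psi, p_psi=1.2):
    """Outer heading loop, Equation (4.20): phi_C = P_psi (psi_C - psi)."""
    return p_psi * (psi_cmd - psi)

def aileron(phi_cmd, phi, p_ail=0.8):
    """Inner roll loop in the shape of Equation (4.16)."""
    return p_ail * (phi_cmd - phi)

alt = AltitudeHold()
elev = alt.elevator(h_cmd=120.0, h=100.0)   # positive: climb toward 120 m
ail = aileron(bank_cmd(0.5, 0.0), 0.1)      # roll toward the commanded heading
print(elev, ail)
```

The cascade mirrors the thesis structure: the heading error sets a bank command, the bank error sets the aileron, and the altitude PID sets the elevator independently.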
4.3 Camera Model In this sub-section, we model the perspective-projection camera (CAS sensor) onboard the owncraft with the geometry of the conflict scenario modelled relative to the CAS sensor.
4.3.1 Configuration
We model the sensor onboard the owncraft by combining two camera projective models in one common axis (as shown in Figure 17). When the intruder is observed in one of the cameras (either of the two black axes), it is projected onto the common image frame, which is assumed to be forward looking (in red). It is in this common image frame that the avoidance control law is defined. Aviation standards [16] state that a CAS should have a ±110˚ field-of-view (FOV) in the horizontal and a ±15˚ FOV in the vertical. For the purposes of this research, two 60˚ FOV cameras in the horizontal have been modelled, as reflected in Figure 17. We have followed this approach considering the following:
Price and convenience – High quality sensors with these specifications are obtained cheaply and conveniently. In addition, they can be interfaced appropriately with powerful processors [32] in the case where computation-intensive image processing is required. They are also lightweight and power efficient [29].
Lens calibration – The image plane and lens distortion in this configuration are minimal when compared with omni-directional sensors [107].
FAA Accident Prevention Program Report – An FAA report for manned aircraft [108] recommends that pilots regularly scan a ±60˚ FOV horizontally in order to prevent a mid air collision (MAC). In keeping with ELOS expectations, a similar approach could be considered appropriate for an automated system.
4.3.2 Perspective Projection Model
In Figure 17, if an intruder is identified in one of the cameras using detection algorithms [33, 109], the intruder pixel location $^I P_{im_n} = (u_n, v_n, f)$, together with which camera the intruder is detected in, is extracted and the CAS notified, which ultimately determines if a collision is imminent and possibly proceeds with the avoidance manoeuvre. In Figure 17 the cameras are $n \in [1, 2]$ and $f$ is the focal length of the cameras. $0_{im_n}$ is the origin of each individual image frame and $0_c$ is the origin of the common camera frame after the transformation of Equation (4.23). $\kappa$ is the relative bearing, or azimuth, to the intruder and $\lambda$ is the elevation.
Figure 17 – Two-camera perspective projection setup
If $^I\Pi_e$ is the position of the intruder in the earth-axes (NED) and $^O\Pi_e$ is that of the owncraft, then $^I\Pi_N = (x_N, y_N, z_N)$ is the intruder's position with respect to the owncraft CoG, i.e.

$^I\Pi_N = {}^I\Pi_e - {}^O\Pi_e$  (4.21)

Although $^I P_{im_n}$ is obtained directly from the image, its relationship to the object space is:

$^I P_{im_n} = \begin{bmatrix} u_n \\ v_n \\ f \end{bmatrix} = \dfrac{f}{^I\Pi_N(z_N)}\begin{bmatrix} ^I\Pi_N(x_N) \\ ^I\Pi_N(y_N) \\ ^I\Pi_N(z_N) \end{bmatrix}$  (4.22)
In this context, TABLE IV shows the subscript notation. Equation (4.23) rotates the camera through $\upsilon$ (rad) about its y-axis, where $\upsilon = \pm fov/2$. For the image plane of the nth camera, the intruder position $^I P_c$ is now:

$^I P_c = \Gamma_y(\upsilon)\,^I P_{im_n}$  (4.23)

For the purposes of this thesis, the rotations through an angle $a$ about the axes $(x, y, z)$ are:

$\Gamma_x(a) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos a & \sin a \\ 0 & -\sin a & \cos a \end{bmatrix}, \quad \Gamma_y(a) = \begin{bmatrix} \cos a & 0 & -\sin a \\ 0 & 1 & 0 \\ \sin a & 0 & \cos a \end{bmatrix}, \quad \Gamma_z(a) = \begin{bmatrix} \cos a & \sin a & 0 \\ -\sin a & \cos a & 0 \\ 0 & 0 & 1 \end{bmatrix}$  (4.24)
This sub-system subsequently transforms $^I P_{im_n}$ of the two rotated cameras to the image plane of one forward-looking camera, $^I P_c$ (the red axis and image plane of Figure 17). All of the control law is then developed on $^I P_c$. The intruder's position in the body-axes, $^I P_b$, is then:

$^I P_b = {}^bT_c + \Gamma_{bc}\,^I P_c$  (4.25)

The cameras' focal point with respect to the UAV CoG (a translation) in the body-axes is $^bT_c$. If $^bT_c = [\Delta x, \Delta y, \Delta z]^T$, then $\Delta x$, $\Delta y$ and $\Delta z$ are typically < 2 m, which is insignificant compared to the distances at which the intruder appears and can be ignored [110]. Thus, Equation (4.25) simplifies to:

$^I P_b = \Gamma_{bc}\,^I P_c$  (4.26)
For the purposes of this simulation, the UAS configuration of Figure 17 rotates through:

$\Gamma_{bc} = \Gamma_z\!\left(\dfrac{\pi}{2}\right)\Gamma_x\!\left(\dfrac{\pi}{2}\right)$  (4.27)

Thus the intruder pixel location rotated through to the UAV CoG, $^I P_N$, is:

$^I P_N = \Gamma_{be}\,^I P_b$  (4.28)
The intruder's position needs to be monitored with the UAV motion compensated for, such that the relative bearing and elevation are measured irrespective of the UAV's behaviour. Because the NED earth-axes align with the UAV body-axes, the only compensation considered necessary is for the Euler angles (attitude) of the owncraft. In addition, it is not necessary to compensate for heading, but rather for heading changes, i.e. a north-always-pointing CAS sensor is not necessary. Therefore:

$\Gamma_{be} = \Gamma_z(-(\psi - \psi_0))\,\Gamma_y(-\theta)\,\Gamma_x(-\phi)$  (4.29)

Where $\psi_0$ is the heading at the point where the intruder is first detected. Thus, the motion compensated location of the intruder, $^I P_{comp}$, with respect to a wings-level, original heading is:

$^I P_{comp} = \Gamma_{bc}^{-1}\,^I P_N$  (4.30)

$^I P_{comp} = \Gamma_{bc}^{-1}\,\Gamma_{be}\,\Gamma_{bc}\,^I P_c$  (4.31)

Then,

$\kappa = \tan^{-1}\!\left(\dfrac{^I P_{comp}(x_{comp})}{f}\right), \qquad \lambda = \tan^{-1}\!\left(\dfrac{^I P_{comp}(y_{comp})}{f}\right)$  (4.32)
For now, $\lambda$ will be neglected and this thesis will concentrate on using $\kappa$. In accordance with Equation (3.17), the relative bearing rate of the intruder is:

$\dot{\kappa}_k = \kappa_k - \kappa_{k-1}$  (4.33)
Figure 18 – Image of the intruder as seen in the camera frame without compensation (top) and with motion compensation (bottom). The units are with respect to the focal length, in millimetres.
Figure 18 shows the evolving image of an intruder over time (40 seconds). The intruder is first detected (green circle) on the right (with $\kappa$ = 41.0˚), and the aircraft banks right and alters heading 20˚ to the right; note that the horizontal FOV is ±60˚. In the uncompensated image (top), one can see the effect of the aircraft banking right and the intruder panning right in the image.
4.4 Simulation Environment
In this sub-section, we describe the image-based collision-avoidance simulation environment (IBCASE), developed using the models of Sections 4.1-4.3 to validate the image-based collision-avoidance control law of Section 3. The architecture of the simulation environment is reflected in Figure 19. It is designed to be generic and adaptable for any CAS sensor configuration and any UAS. IBCASE is implemented using MATLAB. The system is divided into three main components. The UAV emulator propagates the UAV through time using the equations of Section 4.1. The conflict scenario emulator generates the trajectory of the intruder. The vision system emulator generates what the image would be onboard the owncraft.
4.4.1 The Vision System Emulator
The vision sensor simulator is easily adapted for different CAS sensor configurations via the CAS sensor configuration block. This vision sensor simulator generates a motion compensated image $^I P_{comp}$ using the known trajectory of the intruder from the intruder trajectory generator and the attitude of the owncraft $\Theta_e$ from the navigation equations block. It outputs to the collision avoider what the intruder detection system (from Figure 2) would generate, in terms of a pixel location for the intruder. The vision sensor simulator is developed in Section 4.3.2.
4.4.2 The Conflict Scenario Emulator The intruder trajectory generator propagates the track of the intruder based on various user inputs that are in the conflict scenario setup block. These inputs are time of simulation, intruder speed, random start and stop positions etc. The conflict
scenario emulator used in the experimentation for validating this CAS is discussed in Section 5.2.
[Block diagram: the UAV emulator (navigation equations, UAS controller, UAS coefficients, collision avoider), the vision system emulator (vision sensor simulator, CAS sensor configuration) and the conflict scenario emulator (intruder trajectory generator, conflict scenario setup).]
Figure 19 – IBCASE (simulator) architecture
4.4.3 The UAV Emulator
The UAV emulator uses the UAS coefficients of APPENDIX A in the navigation equations block. The navigation equations block propagates the owncraft throughout the conflict scenario using the equations of Section 4.1. It operates in conjunction with the UAS controller block. The UAS controller block has all the proportional-integral-derivative (PID) controller gains, which are tuned to give the owncraft an appropriate (realistic) response. For the Flamingo UAS operating at 25 Hz, these PID gains are in TABLE XIII of APPENDIX A. The control law is prescribed in Section 4.2. Inside the collision avoider is the novel contribution of this research (see Section 3). It determines whether to make an avoidance manoeuvre. If it chooses to manoeuvre, then alternate desired heading and speed commands are issued to the UAS controller. The final responsibility for stability rests inside the UAS controller and is therefore outside the scope of the collision avoidance simulator. However, the open-loop stability of the Flamingo is analysed and presented in APPENDIX B.
4.4.4 Simulator Adaptability
The simulator is designed for easy adaptation to different aircraft, types of conflict scenarios and CAS sensor configurations. The yellow blocks of Figure 19 can be interchanged to vary the experiment. The UAS coefficients block can be substituted with data for any UAS; the data used in this simulation is based on the Flamingo UAS from Silvertone [37], found in APPENDIX A. For each new UAS coefficients block, a corresponding UAS controller with the appropriate PID gains has to be defined; the gains for the 25 Hz Flamingo model are in TABLE XIII of APPENDIX A. Also adaptable is the CAS sensor configuration block: one can easily redefine the resolution, field-of-view, number of cameras, image sensor size, etc. In addition, the conflict scenario setup block can be interchanged for different experiments testing various types of conflict scenarios. The conflict scenario setup block used in this thesis is described in Section 5.2.
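The wiring of the three emulators can be summarised in a minimal, runnable skeleton. The thesis implements IBCASE in MATLAB; the Python fragment below is only an illustration of the loop structure, and every name, trajectory and the wingspan-over-range area model in it is an invented stand-in, not the thesis's code.

```python
import math

def intruder_position(t):
    """Conflict scenario emulator stand-in: straight NED track toward the owncraft."""
    return (2000.0 - 40.0 * t, 0.0, 0.0)

def vision_measurement(intruder, own):
    """Vision system emulator stand-in: bearing and subtended area of the intruder."""
    dx, dy = intruder[0] - own[0], intruder[1] - own[1]
    rng = math.hypot(dx, dy)
    kappa = math.atan2(dy, dx)
    sigma = (10.0 / rng) ** 2          # ~ (assumed 10 m wingspan / range)^2
    return kappa, sigma

def run(duration=20.0, dt=0.04):
    """One IBCASE-style pass: propagate intruder, render measurement, log."""
    own, log, t = (0.0, 0.0, 0.0), [], 0.0
    while t < duration:
        kappa, sigma = vision_measurement(intruder_position(t), own)
        log.append(sigma)              # the collision avoider would threshold C_AS here
        t += dt                        # the UAV emulator would also propagate the owncraft
    return log

log = run()
print(log[0], log[-1])                 # sigma grows as the intruder closes
```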
5 Results and Analysis In this chapter, we discuss performance metrics and present the experiment that tests the proposed image-based CAS. We display the results, and then present analysis and discussion.
5.1 Performance Analysis
In this sub-section, we discuss existing measures and encounter models for assessing the performance of collision avoidance systems. There are three different metrics that we use to characterise the performance of the developed CAS. The first measure uses standard operating characteristic (SOC) curves developed by Kuchar [111], which are assessed over a range of thresholds on $C_{AS}$ (the test statistic developed in this thesis). The second method assesses the Risk Ratio, a measure of UAS performance in a conflict scenario (NMAC) with and without the CAS [112]. In the third method we assess the CAS using the ELOS expectations of Dalamagkidis et al. [10] for a collision scenario (MAC) at a nominal $C_{AS}$ threshold. The $C_{AS}$ threshold chosen for the ELOS-expectation performance measure is the one that has the lowest Risk Ratio (10). These three methods of displaying the CAS results endeavour to benchmark the performance of the developed CAS against existing systems.
5.1.1 Encounter Models
Before we discuss performance measures, it is important to understand encounter models. Comprehensive encounter models gather radar and surveillance data in a given NAS and analyse it to generate realistic encounter scenarios for CAS testing in a simulation environment [113]. Models have evolved over the last few decades and today incorporate non-cooperative encounter data [114] (known as uncorrelated models). Perhaps the most comprehensive model is the Lincoln Labs model [115]. It uses more than 420 000 encounters from 127 different radar sites in the US, and learns the airspace encounter model using dynamic Bayesian networks. This model can then generate random conflict scenarios that are statistically representative of actual encounter scenarios in the NAS, with a known expected rate for each encounter type. These scenarios are then used to validate CASs in simulation. Developing a comprehensive encounter model is beyond the scope of this research; instead we use a Monte Carlo simulation with a comparatively simplistic encounter scenario (detailed in Section 5.2). For the expected rate of occurrence of a mid air collision, $E_{MAC}$ (MACs/flight hour), we use the Class E airspace worst-case statistic from Weibel and Hansman [9].
5.1.2 Performance Measures

5.1.2.1 Standard Operating Characteristic Curves

There are two measures for characterising the performance of a CAS [111]. The first is the success rate of the system, i.e. the probability that the UAV will avoid a conflict scenario given that a conflict scenario is otherwise inevitable (P_SA and P_CD below). The second is the false alarm rate, i.e. the probability that the UAV will attempt to avoid a collision when there is no conflict scenario (P_UA and P_FA below). These probabilities change with the CAS threshold: adjusting the sensitivity to increase the success rate consequently increases the false alarm rate. To capture this trade-off, Kuchar [111] uses standard operating characteristic (SOC) curves. These are an adaptation from signal detection theory [116], where receiver operating characteristic curves are used to detect signals amongst background noise at various thresholds. In these plots, Kuchar plots the probability of correct detection P_CD against the probability of a false alert P_FA. An example plot displaying the line-of-little-benefit (dash-dot line) is shown in Figure 20. If a result falls under the line-of-little-benefit then the false alert rate is higher than the success rate, and the system is of little benefit.

Figure 20 – Example of a standard operating characteristics curve [111]
Winder and Kuchar [117] break down these probabilities (P_CD and P_FA) to reveal all the possible outcomes of an encounter. TABLE V lists the outcome categories illustrated in Figure 21.
TABLE V
POSSIBLE OUTCOME CATEGORIES [115]

Category            Abbreviation   Alert Necessary?   Alert Issued?   Conflict Occurred?
False Alert         FA             No                 Yes             No
Induced Conflict    IC             No                 Yes             Yes
Correct Avoidance   CA             Yes                Yes             No
Late Alert          LA             Yes                Yes             Yes
Missed Detection    MD             Yes                No              Yes
Proper Rejection    PR             No                 No              No
Figure 21 – Possible outcomes for UAS with collision avoidance system [115]
Kochenderfer et al. [115] use P_SA and P_UA for their SOC curves. Although we have displayed the results in this manner, we have also chosen to plot the SOC curves using P_CD and P_FM. The various probabilities are defined:

1. Probability of Conflict, P_Con:

\[ P_{Con} = \frac{IC + LA + MD}{FA + IC + CA + LA + MD + PR} \]   (5.1)

2. Probability of Alert, P_Alert:

\[ P_{Alert} = \frac{FA + IC + CA + LA}{FA + IC + CA + LA + MD + PR} \]   (5.2)

3. Probability of Satisfactory Alert, P_SA:

\[ P_{SA} = \frac{FA + CA}{FA + IC + CA + LA + MD} \]   (5.3)

4. Probability of Unnecessary Alert, P_UA:

\[ P_{UA} = \frac{FA + IC}{FA + IC + CA + LA + MD} \]   (5.4)

5. Probability of Conflict with No Action, P_CNA:

\[ P_{CNA} = \frac{CA + LA + MD}{FA + IC + CA + LA + MD + PR} \]   (5.5)

6. Probability of Correct Detection, P_CD:

\[ P_{CD} = \frac{CA}{CA + LA + MD} \]   (5.6)

7. Probability of False Manoeuvre, P_FM:

\[ P_{FM} = \frac{FA + IC}{FA + IC + PR} \]   (5.7)
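The outcome counts of TABLE V map directly onto Equations (5.1)-(5.7). The thesis experiments were implemented in MATLAB/Simulink; as an illustrative sketch only (the function name and the counts below are made up), the probabilities can be computed from category counts as:

```python
# Sketch: computing the probability measures of Equations (5.1)-(5.7)
# from counts of the six encounter outcome categories of TABLE V.
# The counts passed in at the bottom are hypothetical, for illustration.

def soc_probabilities(FA, IC, CA, LA, MD, PR):
    """Return the probability measures used for the SOC curves."""
    total = FA + IC + CA + LA + MD + PR
    necessary = CA + LA + MD        # outcomes where an alert was necessary
    unnecessary = FA + IC + PR      # outcomes where no alert was needed
    return {
        "P_Con":   (IC + LA + MD) / total,        # (5.1) a conflict occurred
        "P_Alert": (FA + IC + CA + LA) / total,   # (5.2) an alert was issued
        "P_SA":    (FA + CA) / (total - PR),      # (5.3) satisfactory alert
        "P_UA":    (FA + IC) / (total - PR),      # (5.4) unnecessary alert
        "P_CNA":   necessary / total,             # (5.5) conflict if no action
        "P_CD":    CA / necessary,                # (5.6) correct detection
        "P_FM":    (FA + IC) / unnecessary,       # (5.7) false manoeuvre
    }

probs = soc_probabilities(FA=120, IC=5, CA=800, LA=30, MD=20, PR=525)
print(probs["P_CD"])   # CA / (CA + LA + MD)
```

Sweeping the CAS threshold and re-counting the six categories at each value yields one (P_FA or P_FM, P_CD) point per threshold, which traces out the SOC curve.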
In Figure 21, the conflict region is defined by a cylinder of radius 152.4 m, the NMAC radius around the owncraft. The alert region is the region wherein it is possible for the system to alert, i.e. the false detection region. The radius of the alert region increases as the sensitivity of the CAS is increased (by increasing the CAS threshold). In this research it was found that for the highest CAS threshold tested (16), no manoeuvres were triggered for aircraft with a CPA distance greater than 1 km (when tested 10 000 times). Thus the alert region has a radius of 1 km, and hence Y_upper = 1 km in Section 5.2.1.
5.1.2.2 Risk Ratio

Risk Ratio is a measure that has traditionally been used to assess the performance of a TCAS [112]. The Risk Ratio is the probability that an NMAC will occur with a CAS divided by the probability that it will occur without the CAS. Lincoln Labs [111, 115] and Eurocontrol [112, 118] have published TCAS/ACAS Risk Ratio results.

\[ RiskRatio = \frac{P_{NMAC\,with\,CAS}}{P_{NMAC\,wo\,CAS}} \]   (5.8)

Using TABLE V would give:

\[ RiskRatio = \frac{\dfrac{IC + LA + MD}{IC + FA + CA + LA + MD + PR}}{\dfrac{CA + LA + MD}{IC + FA + CA + LA + MD + PR}} \]   (5.9)
\[ RiskRatio = \frac{IC + LA + MD}{CA + LA + MD} \]   (5.10)

The Risk Ratio is assessed over the alert region shown in Figure 21. It consists of two components [111]: one due to the Induced Conflicts, RR_IC, and another from the unresolved component, RR_unresolved, i.e. where avoidance failed.

\[ RiskRatio = RR_{IC} + RR_{unresolved} \]   (5.11)

\[ RR_{IC} = \frac{IC}{CA + LA + MD} \]   (5.12)

\[ RR_{unresolved} = \frac{LA + MD}{CA + LA + MD} \]   (5.13)
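The decomposition of Equations (5.11)-(5.13) can be checked numerically. The sketch below uses hypothetical counts (not thesis data) to show that the induced and unresolved components sum to the Risk Ratio of Equation (5.10):

```python
# Sketch: the Risk Ratio of Equation (5.10) and its decomposition into
# induced and unresolved components, Equations (5.11)-(5.13).
# The counts used at the bottom are illustrative only.

def risk_ratio(IC, CA, LA, MD):
    denom = CA + LA + MD                  # NMACs that would occur without a CAS
    rr_ic = IC / denom                    # (5.12) induced-conflict component
    rr_unresolved = (LA + MD) / denom     # (5.13) unresolved component
    return rr_ic + rr_unresolved, rr_ic, rr_unresolved   # (5.11)

rr, rr_ic, rr_unres = risk_ratio(IC=5, CA=800, LA=30, MD=20)
# rr equals (IC + LA + MD) / (CA + LA + MD), i.e. Equation (5.10)
```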
The current TCAS Risk Ratio results (from a correlated model, that is, using scenarios where coordinated avoidance manoeuvres take place) are 3.3% from ICAO [112] and 5.5% from the FAA [115]. For the uncorrelated model (uncooperative scenarios, relevant to this research), the TCAS figures are 22.9% from ICAO [112] (with RR_IC at 13.7%) and 23% in the US [119].
5.1.2.3 ELOS Expectations

In Section 1.1, we discussed how UAS are expected to have an equivalent level of safety (ELOS) to that of manned aircraft in order to enable free integration into the NAS [8-11, 25, 45, 120]. Figures published by the National Transportation Safety Board (NTSB) put the probability of fatalities in piloted aircraft at P_Fatality = 10^-6 hr^-1, but a more conservative figure such as P_Fatality = 10^-7 hr^-1 should be expected [28]. According to Dalamagkidis et al. [28], it is reasonable to assume that for UAS a mid air collision will result in a human fatality. Then, from NTSB data from 1983 to 2006, a probability of a MAC of P_MAC = 10^-7 hr^-1 is proposed for UAS [10, 28]. On the other hand, Eurocontrol use P_MAC = 3 × 10^-8 [112]. For the NMAC case, Eurocontrol use P_NMAC = 3 × 10^-7 [112] and P_NMAC = 1.7 × 10^-7 [118]. Kuchar and Drumm [119] confirm these MAC rates. Weibel and Hansman [9] use a lower P_MAC rate because it is based on Class E airspace data. We will use the figures of Weibel and Hansman [9] for our assessment because Class E airspace is similar to Class G airspace, which is where this CAS is designed to operate.

Weibel and Hansman [9] use cooperative data in the Endoh aircraft collision gas model [121] for encounter modelling, to define the expected number of collision scenarios an owncraft is likely to encounter, E_MAC (MACs/hr).
An expected collision occurs if the exposure volume overlaps with the UAV; the expected number of collisions is equal to the ratio of total collision volume to the volume of airspace [9]. Using the NTSB data, Weibel and Hansman [9] determine a figure of E_MAC = 4 × 10^-5 collisions/hr (at FL370). A worst-case conservative estimate of E_MAC = 10^-4 collisions/hr is proposed [10].
Dalamagkidis et al. [10] further develop Weibel and Hansman's [9] formula to account for the fact that an owncraft with a CAS may manoeuvre and avoid a collision, via the Risk Ratio that pertains to the MAC case, RR_MAC:

\[ RR_{MAC} = \frac{P_{MAC\,with\,CAS}}{P_{MAC\,wo\,CAS}} \]   (5.14)

If the conflict region of Figure 21 is defined as a MAC (see the definition in Section 1.2.2), then

\[ P_{MAC} = E_{MAC} \cdot RR_{MAC} \]   (5.15)

Substituting the proposed worst-case E_MAC = 10^-4 and the target P_MAC = 10^-7 into Equation (5.15) gives:

\[ RR_{MAC} \le 10^{-3} \]   (5.16)

This estimate is based on Class E airspace, and the same assumptions cannot be made for Class G airspace because one is not able to monitor the traffic in Class G [10].
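The figure in Equation (5.16) is simply the ratio of the target MAC probability to the worst-case expected MAC rate. Rearranging Equation (5.15) as a quick arithmetic sketch:

```python
# Sketch of the ELOS arithmetic: RR_MAC = P_MAC / E_MAC, using the
# worst-case figures quoted above.
P_MAC_target = 1e-7   # target MAC probability per flight hour [10, 28]
E_MAC = 1e-4          # worst-case expected MAC rate, collisions/hr [10]
RR_MAC_required = P_MAC_target / E_MAC
print(f"{RR_MAC_required:.0e}")   # prints 1e-03, Equation (5.16)
```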
5.2 Experiment Setup

In this section, we outline the system aim and the experiment, detailing the objectives, scope and limitations of the simulation. In this research we aim to prevent a near mid air collision (NMAC) laterally. This means that the above controller is designed to ensure that the two encountering aircraft stay more than 152.4 m apart laterally at all times. Although we are aiming at preventing NMACs, we also assess the MAC performance. The objective of the proposed image-based CAS is to use an image-based sensor to detect and avoid a conflict scenario without inferring range. In order to validate the performance of the proposed image-based collision avoidance system, there are two principal objectives that drive the two experiments:

1) To determine how successful the CAS is at avoiding a conflict scenario. This is called the success rate experiment, and from it we obtain P_SA and P_CD.

2) To determine how often the CAS performs an unnecessary avoidance manoeuvre, and with what outcome. This is called the false alarm experiment, and from it we obtain P_Con, P_Alert, P_CNA, P_FM and P_UA.
5.2.1 Monte Carlo Simulations

The overall Monte Carlo simulation was run 50 000 times for each of the two experiments described above. These experiments ensure that at some random time X_m in the owncraft's straight two-minute voyage, an intruder will come within Y_m metres of the owncraft. A small selection of 50 intruder tracks (thin green lines) is shown in Figure 22. In Figure 22, the owncraft (thick blue line) is not attempting to manoeuvre out of the way. The circles represent the beginning of the tracks. This experiment operated within the following scope:
1. The time X_m in the two-minute voyage is random, between 15 seconds and 120 seconds, with a normal distribution.
   a. The lower bound of 15 seconds is implemented because:
      i. in the first 3 seconds nothing happens apart from owncraft stabilisation and trimming;
      ii. the next 12 seconds come from the fact that UAS are expected to perform with an ELOS to manned aircraft [8, 25, 45, 120], and the literature states that a pilot takes about 12.5 seconds to detect and react [122]. Therefore, an unmanned system would not be expected to avoid anything in less than 12 seconds.
   b. The upper bound of 120 seconds is a reasonable figure with no real significance, other than to affirm that few real-life conflict scenarios would take longer than two minutes to play out, from first detection until passing. Some encounter models assess one-minute collision scenarios [113].
2. The value for Y_m had a lower bound Y_lower and an upper bound Y_upper (metres).
   a. For Experiment 1: Y_lower = 0 m and Y_upper = 152.4 m.
   b. For Experiment 2: Y_lower = 0 m and Y_upper = 1 km.
3. The intruder had a random straight path, with a normal random distribution for the beginning and end-points of its track. The following were applied to the intruder's path:
   a. The intruder's maximum velocity is 250 KTAS (Class E and G airspace rules).
   b. The intruder began within the field-of-view of the CAS sensor (in our case a ±60˚ horizontal and ±23.4˚ vertical forward-looking field-of-view).
   c. The intruder did not start within 1 km of the owncraft. Effectively this meant that the intruder is larger than 0.82 m, because of the implemented pixel resolution and our intruder detection system, which is sub-pixel in nature [33].
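The scope above can be condensed into a small scenario generator. The sketch below is my own reading of these constraints, not the thesis code (which was implemented in MATLAB); the function name and the distribution parameters (e.g. the mean and spread of the conflict time, the start-range upper bound) are assumptions:

```python
# Sketch of the encounter generator described above: a random conflict time
# X_m, a random lateral miss distance Y_m, and an intruder that starts at
# least 1 km away inside the sensor field-of-view at no more than 250 KTAS.
import math
import random

def sample_encounter(y_lower=0.0, y_upper=152.4):
    # conflict time (s): normally distributed, clipped to [15, 120]
    x_time = min(max(random.gauss(67.5, 20.0), 15.0), 120.0)
    y_cpa = random.uniform(y_lower, y_upper)            # lateral miss distance (m)
    bearing = math.radians(random.uniform(-60.0, 60.0)) # inside the +/-60 deg FOV
    start_range = random.uniform(1000.0, 8000.0)        # never starts within 1 km
    speed = random.uniform(20.0, 250 * 0.514444)        # up to 250 KTAS, in m/s
    return x_time, y_cpa, bearing, start_range, speed

x, y, b, r, v = sample_encounter()
```

For the false alarm experiment the same generator would simply be called with `y_upper=1000.0`, matching item 2b.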
Figure 22 – Random selection of intruder tracks encroaching owncraft
5.2.2 Limitations and Assumptions

The Monte Carlo simulations described above are only a preliminary experiment to evaluate the performance of the proposed avoidance algorithm. Some of the assumptions and limitations of these simulations are:

1. The simulation does not consider the elevation of the intruder and thus disregards the altitudinal aspects of manoeuvring in three dimensions. Because of the comparative responsiveness of the altitude controller and the smaller separation requirement in the vertical plane (100 feet, or 30.5 m), it is expected that a 3D version of this image-based collision avoidance system would achieve better results, perhaps even guaranteeing the separation distances of an NMAC zone.
2. It only considers intruders with straight trajectories. It could be expanded to include intruders with curved trajectories.
3. It does not detect collisions whilst the owncraft is manoeuvring. That is, once the owncraft begins a manoeuvre, the part of the system that monitors the CAS test statistic, to see if it drops below the threshold, stops making decisions until the owncraft has returned to level flight.
4. It does not monitor the complete ±110˚ horizontal FOV recommended by aviation standards [16].
5. There are also the limitations of the actual sensor in terms of all-weather performance: image-based sensors, whether vision or infrared, do not perform well in cloudy conditions [29].
5.3 Results and Analysis

5.3.1 CAS Threshold Determination

A CAS threshold needs to be determined in order to assess a particular configuration against ELOS expectations (as discussed in Section 5.1.2.3). In order to find a reasonable value, we took the threshold with the lowest Risk Ratio; for the CAS developed in this thesis, this is a threshold of 10. It was noticed that as the CAS threshold increased beyond 10, the IC component increased as well. Figure 23 shows the distribution of the minimum value of the CAS test statistic, min(CAS), over 10 000 simulations that were all inside the perimeter of the NMAC region.
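This selection rule — sweep the candidate thresholds and keep the one with the lowest Risk Ratio — can be sketched as follows. The `run_experiment` argument stands in for the full 50 000-run Monte Carlo experiment, and the example values below are made up to mimic the shape of the Risk Ratio curve (falling to a minimum near 10, then rising as induced conflicts grow):

```python
# Sketch of the threshold-determination step: evaluate the Risk Ratio at
# each candidate CAS threshold and return the threshold that minimises it.

def pick_threshold(thresholds, run_experiment):
    results = {t: run_experiment(t) for t in thresholds}  # threshold -> risk ratio
    return min(results, key=results.get)

# Illustrative stand-in for the Monte Carlo results (hypothetical values)
fake = {2: 0.30, 4: 0.12, 6: 0.05, 8: 0.02, 10: 0.0127, 12: 0.015, 14: 0.02, 16: 0.03}
best = pick_threshold(fake.keys(), lambda t: fake[t])
print(best)   # 10
```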
5.3.2 Observations and Behavioural Patterns

When the algorithm of TABLE I is implemented with a hold time θ of 16 seconds, we observe three behavioural patterns. The first, termed a single manoeuvre, is an owncraft that performs an avoidance manoeuvre but returns to the original heading immediately. The second behavioural pattern, called a maintained manoeuvre, turns the owncraft onto the altered heading and, when it achieves this new heading, continues to maintain it for θ seconds (x number of times) before returning to the original heading. The third type of behaviour observed, called a repeated manoeuvre, sees the owncraft perform an avoidance manoeuvre and then immediately return; however, it performs a subsequent manoeuvre because the CAS threshold is again violated (the risk of an NMAC is deemed high enough by the CAS).
Figure 23 – Distribution of min(CAS) for experiment 1
The 120-second tracks of Figure 24, Figure 26 and Figure 28 show an owncraft's original route (red dashed) that would have encountered an intruder (green dotted), with the original closest point of approach (green square). The owncraft's avoidance route is shown (blue solid) with the new closest point of approach (blue star). The circles show the start of each aircraft's time track.
The CAS behaviour plots of Figure 25, Figure 27 and Figure 29 display the CAS test statistic (solid red line) with the thresholds shown at ±16 (straight black lines). Because the test statistic displays unstable behaviour during the avoidance manoeuvre, the last accurate reading is maintained (blue dotted line) until the new heading is attained.
Figure 24 depicts a single manoeuvre; it shows the track of an owncraft that has turned onto the new avoidance heading and, upon achieving it, has deemed the intruder no longer a risk and immediately returned to the original heading. The associated CAS plot is shown in Figure 25. From Figure 25, one can note that the avoidance manoeuvre is made at 3 seconds and the return is initiated almost immediately, at 19 seconds, because the statistic is then out of the threshold region.

Figure 26 illustrates a maintained manoeuvre; it is an example of an owncraft that has detected an intruder, triggering an avoidance manoeuvre, but then, once on the new heading, the CAS test statistic is still under the threshold (at around 18 seconds), so the new heading is maintained for θ (until around 36 seconds). Figure 27 shows the CAS behaviour of Figure 26 for the first 60 seconds. One can see the avoidance manoeuvre is made at 3 seconds and the new heading achieved at about 18 seconds. However, the CAS threshold is still violated, so the owncraft maintains the heading for θ and initiates the return at around 36 seconds, achieving the original heading at around 53 seconds.

Figure 28 shows the track for a repeated manoeuvre. This is where an owncraft has manoeuvred and then deemed it safe to return to the original heading (because the CAS test statistic is no longer within the threshold); however, upon recovering the original track, the intruder again violates the CAS threshold and the owncraft repeats an avoidance manoeuvre. This can happen multiple times, although more than three successive manoeuvres were very rare. Figure 29 illustrates the CAS behaviour: the manoeuvre is triggered at 3 seconds and again at about 36 seconds, and the returns are triggered at around 19 seconds and at 50 seconds.
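The three behaviour patterns fall out of a single decision loop: trigger on a threshold violation, hold the last stable reading while turning, and re-check once the new heading is attained. A simplified state-machine sketch of that logic (my own abstraction of the behaviour described above, not the thesis implementation) is:

```python
# Sketch: one step of the avoidance decision loop. A conflict is flagged when
# the test statistic falls inside the +/-threshold band, as in Figures 25-29.
# State names and arguments are my own; the real controller runs at 25 Hz.

def cas_step(state, stat, threshold=16.0, on_new_heading=False, hold_expired=False):
    if state == "cruise":
        # statistic inside the threshold band -> conflict -> manoeuvre
        return "avoid" if abs(stat) < threshold else "cruise"
    if state == "avoid" and on_new_heading:
        # re-check on the new heading: still violated -> maintain for theta
        return "maintain" if abs(stat) < threshold else "return"
    if state == "maintain" and hold_expired:
        return "return"   # held the new heading for theta seconds
    if state == "return" and on_new_heading:
        return "cruise"   # back on the original heading; may re-trigger (repeated manoeuvre)
    return state

print(cas_step("cruise", stat=8.0))   # avoid: 8 lies inside the +/-16 band
```

A single manoeuvre is cruise → avoid → return → cruise; a maintained manoeuvre passes through the maintain state; a repeated manoeuvre re-enters avoid after returning to cruise.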
Figure 24 – Example of scenario where an owncraft reaches new heading and immediately returns to the original heading. This is an example of a single manoeuvre.
Figure 25 – CAS behaviour for first 60s of Figure 24 track. The CAS test statistic (red line) is between thresholds (±16) therefore a manoeuvre is made (3 secs). The CAS is maintained at the last stable reading (blue dotted line) during the manoeuvre. At the new heading, it is deemed safe to return to the original heading (20 secs), where the stable CAS is held (blue dotted line) until on original heading.
Figure 26 – Example of scenario where owncraft maintains new heading for θ seconds before returning to original heading. This is an example of a maintained manoeuvre.
Figure 27 – CAS behaviour for first 60s of Figure 26 track. The CAS test statistic (red line) is between thresholds (±16) therefore a manoeuvre is made (3 secs). The CAS is maintained at the last stable reading (blue dotted line) during the manoeuvre. At the new heading, it is still not safe to return to original heading (20 secs), so the current heading is maintained for ϴ time until another CAS reading decides it is safe to return to original heading (36 seconds).
Figure 28 – Example of scenario where owncraft avoids and returns to original heading; however, the CAS threshold is violated a second time. This is an example of a repeated manoeuvre.
Figure 29 – CAS behaviour for first 70s of Figure 28 track. An avoidance manoeuvre is made at 3 secs and then the CAS decision returns the owncraft to the original heading (19 secs). When the owncraft has returned to the original heading a second manoeuvre is performed (36 secs) and returns again (50 secs).
5.3.3 Probabilistic Results

Figure 30 shows the distribution of CPAs for 10 000 simulations from experiment two in which an avoidance manoeuvre is triggered (because of the CAS threshold) but unnecessary (because the CPA distance is greater than the NMAC distance). It shows the distribution of what would have been the CPA distance before the CAS is implemented, and the new CPA after the CAS manoeuvred the owncraft. These therefore correspond to the Induced Conflicts and False Alerts of Figure 21. Notice the Induced Conflicts: in the left diagram there are no CPA instances under 152.4 m, but there are some in the right diagram (after the CAS manoeuvres).

Figure 30 – False Positive distributions before and after CAS is implemented
In this instance (a threshold of 16), the probability that the CAS would manoeuvre falsely, P_FM, is 51.21%. This is a measure of oversensitivity, and it is assessed over the entire spectrum of ranges where the collision detector could trigger a manoeuvre (for a threshold of 16 this means Y_upper = 1 km and Y_lower = 0 m). Note that P_FM increases as the CAS threshold increases. However, of those that manoeuvred unnecessarily, on average the CAS increased the CPA distance between the two aircraft by 480.1 m, and thus increased safety.
Figure 31 – A selection of Correct Avoidances made using implemented algorithm. (a) top left – left intruder approach with maintained manoeuvre (b) top right – right intruder approach with single manoeuvre (c) middle left – left intruder approach with single manoeuvre (d) middle right – right intruder approach with single manoeuvre (e) & (f) bottom – right intruder approach with repeated manoeuvre.
Figure 32 – Another selection of Correct Avoidances made using implemented algorithm. (a) top left – left intruder approach with maintained manoeuvre (b) top right – right intruder approach with single manoeuvre (c) & (d) middle – right intruder approach with single manoeuvre (e) & (f) bottom – right intruder approach with repeated manoeuvre.
Figure 31 shows a selection of cases where the owncraft successfully avoided the NMAC. Figure 33 shows a selection of instances where the controller failed to avoid the conflict. In the top left illustration of Figure 33 is a case of Missed Detection, which had a CPA (Y_m) of 93.5 m, whereas the other three are Late Alerts.
Figure 33 – Failed avoidance detection or manoeuvres according to TABLE V and Figure 21. (a) top right – Missed Detection (b) top left – Late Alert (c) bottom left – Late Alert (d) bottom right – Late Alert on a repeated manoeuvre.
5.3.4 Performance Results

5.3.4.1 Standard Operating Characteristics

Figure 34 shows the SOC curves using P_SA and P_UA of Equations (5.3) and (5.4), after [115]. Alternatively, results that implement P_CD and P_FM of Equations (5.6) and (5.7), from the original [111] and others [36], are shown in Figure 35. Shown on these plots is the line-of-little-benefit (red dashed line). When the system operates below this line, more false alarms than satisfactory alerts are triggered.
Figure 34 – Standard Operating Characteristics (SOC) curve for CAS
As a basis for comparison, we have also shown a system that operates solely on relative bearing rate (green dotted line). Also seen in Figure 34 is that Induced Conflicts increase as P_UA increases, because P_SA tends away from unity. This is not reflected in the P_CD/P_FM plots of Figure 35, and is the main reason for using a SOC curve with P_SA and P_UA.
Figure 35 – SOC curve that displays original PCD and PFM
5.3.4.2 Risk Ratio

Displayed in Figure 36 are the results for the Risk Ratio (blue solid lines) at various thresholds (on the x-axis), for both the CAS test statistic (top) and the relative bearing experiment (bottom). The Induced Conflict component of the Risk Ratio, RR_IC of Equation (5.12), is shown as the red dash-dot line. From the top diagram of Figure 36 one can see that a threshold of 10 has the lowest Risk Ratio; this is why it is chosen as the threshold for the ELOS analysis in Section 5.3.4.3. One can also see that at the chosen threshold of 10, a Risk Ratio of 1.266% with
RR_IC = 0.596% is produced. Historically, 10% of NMACs lead to MACs [123], which can be seen in the P_MAC and P_NMAC figures Eurocontrol use [112].
Figure 36 – Risk Ratio results for CAS
When compared with the TCAS results for uncooperative scenarios (23%) [112], these results are an order of magnitude better. However, there is no real comparison between TCAS and this CAS, as TCAS is tested using manoeuvring intruders in a comprehensive encounter model (as discussed in Section 5.1.1), whereas for our CAS a simplified encounter model is used. Also noteworthy is that TCAS is designed in particular for the cooperative domain, whereas this CAS is designed specifically for uncooperative scenarios; therefore a direct comparison is not relevant.
Shane Degen
Page 103 of 129
Reactive Image-based Collision Avoidance System for Unmanned Aircraft Systems
Even though the chosen threshold of 10 has a high P_FM rate of 27.35% (oversensitivity), manoeuvring unnecessarily one time in four is deemed a reasonable price for the high level of safety (low Risk Ratio).
5.3.4.3 ELOS Expectations

For assessing ELOS expectations, we determined that the total Risk Ratio for a MAC is RR_MAC = 1.27 × 10^-3. This comes from the assumption that 10% of NMACs lead to MACs [112, 123]. The conservative figures released by Dalamagkidis et al. [28, 124] state that a CAS would need to meet the manned aviation ELOS of around RR_MAC = 1 × 10^-3, from Equation (5.16), for Class E airspace (remembering that there are no figures for Class G airspace). Thus, the figures of the collision avoidance system of this research are of the same order of magnitude. It is therefore reasonable to claim that this collision avoidance system has results that are comparable to current ELOS expectations for operations in Class E airspace.
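The MAC-level figure follows directly from scaling the measured NMAC Risk Ratio by the historical 10% NMAC-to-MAC rate. As a quick arithmetic check:

```python
# Sketch: scaling the measured NMAC Risk Ratio (at the chosen threshold of 10)
# to the MAC case using the historical 10% NMAC-to-MAC rate [112, 123].
rr_nmac = 0.01266             # measured NMAC Risk Ratio, 1.266%
rr_mac = 0.1 * rr_nmac        # MAC Risk Ratio, ~1.27e-3
elos_target = 1e-3            # Class E expectation, Equation (5.16)
print(rr_mac / elos_target)   # ratio to the target; about 1.27, same order of magnitude
```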
6 Conclusion

As discussed in Section 1.1, the collision avoidance problem is one of the major hurdles to allowing UAS to operate freely in the NAS and thus to ensuring continued UAS market growth [8]. Various major players within industry have made reasonable attempts at solving the collision avoidance problem [38-46]. However, the system industry proposes uses a sensor that costs around $200k, consumes a lot of power and is heavy [31]. A system of this magnitude is currently unreasonable for low-cost UAS. A collision avoidance system using vision-only sensors would present a solution for the low-cost UAS market and be a major technological enabler for the entire UAS sector [29].

This thesis has presented a well-defined methodology for a collision avoidance algorithm that uses vision-only data to negotiate a conflict scenario without calculating range. Because no range estimates are made, action can be taken almost immediately (within ~0.12 seconds), which is orders of magnitude faster than rival systems [25, 30, 34] and thus improves the overall safety of the collision avoidance system (CAS).

In this thesis, we investigated the intruder's characteristics in an image that directly affect the miss distance in a conflict scenario: namely the intruder image velocity, and the time to collision (TTC), which is derived from the intruder's angular-area subtended and its rate (known as image area expansion). These image-based characteristics are implemented in a CAS that uses a test statistic in a thresholding approach to detect a conflict. The algorithm also uses these image-based characteristics to manoeuvre the owncraft to avoid the collision. The objective of the developed CAS is to avoid a mid air collision (MAC) and a near mid air collision (NMAC); that means for successful collision avoidance the two aircraft need to exceed a closest point of approach (CPA) distance of 152.4 metres (500 ft) laterally.
We mathematically modelled the UAS platform (Flamingo) on which the proposed CAS is tested, along with the UAS controller and the CAS vision sensor, which is motion-compensated using feedback from the owncraft's inertial instruments. We outlined the architecture of the simulation environment used to test the proposed CAS. Finally, a simplified encounter model is implemented in a Monte Carlo simulation that is used to simulate NMACs, defined as having a CPA distance of less than 152.4 m. These simulations are run 50 000 times at various test statistic thresholds. The developed CAS is gauged against a system that uses only intruder image velocity in a similar thresholding approach. The results are displayed in standard operating characteristic (SOC) curves for both CASs over a range of test statistics. These SOC curves display correct detection performance against false alarm rate. It is established that the CAS of this thesis, which utilises TTC, performs much better than a system that uses image velocity only. Also shown is that for Class E airspace the published probability-of-MAC ELOS expectation is RR_MAC = 10^-3. The figure that this CAS produced in simulation experiments is RR_MAC = 1.27 × 10^-3, which is of the same order of magnitude. Thus, it is reasonable to say that this CAS has results that are comparable with current ELOS expectations for Class E airspace.
7 Future Recommendations This research aimed to prove the feasibility of using vision-only data for collision avoidance in a heavily regulated environment such as aviation. We have shown that the results are favourable and comparable to the ELOS requirements such a system needs to meet. From here, it is recommended to expand the research to include collision detection in the second image dimension (elevation) and thus manoeuvring in the third dimension (altitude). Because the defined NMAC zone in the vertical plane is only 30.5 m, and the responsiveness of an owncraft in the vertical plane is often quicker, it is reasonable to expect even better results. The next step would be to include manoeuvring intruders (on constant curves); however, more encounter modelling needs to be completed in this area first. Alternatively, running this algorithm against one of the noted comprehensive encounter models would go towards demonstrating realisable results. After this, multiple-intruder collision avoidance would be the next logical step. As well as adopting the more comprehensive encounter model, it would be valuable to obtain flight-test results, as other unforeseeable problems may need addressing.
8 Appendices

APPENDIX A Data and aerodynamic coefficients of the Flamingo UAS [125]

TABLE VI
FLAMINGO DATA

Variable            Symbol   Value       Units
Mass                m        20          kg
Mean chord          c        0.29        m
Surface area        S        1.15        m²
Wingspan            b        4.0         m
Centre of gravity   CoG      0.25 c
Airfoil                      NACA 2415
TABLE VII
FLAMINGO LIMITS

Variable                  Symbol   Value   Units
Maximum thrust            T_max    24.5    N
Thrust slew limit         Ṫ_max    5       N/s
Max angle of attack       α_max    16      ˚
Stall speed               V_stall  13      m/s
Typical operation speed   V_T      27      m/s
Maximum speed             V_max    40      m/s
Climb rate                ḣ_max    92      m/min
Max stable roll angle     φ_max    20      ˚
Max heading rate          ψ̇_max    3.75    ˚/s
TABLE VIII
INERTIAL DATA

Axis    Value (kg·m²)
I_X     5.0
I_Y     6.28
I_Z     9.18
I_ZX    0
TABLE IX
LIFT/DRAG DATA

Coefficient   Value
C_L0          0.04
C_Lα          6.0
C_Lq          7.729
C_D0          0.02
k             0.0039
TABLE X
LONGITUDINAL COEFFICIENTS

Coefficient   Value
C_m0          -0.055
C_mα          -0.85
C_mδe         -1.571
C_mq          -41.3
C_mα̇          -10.7
TABLE XI
LATERAL COEFFICIENTS

Coefficient   Value
C_Yβ          -0.308
C_Yδr         0.2
C_Yp          0.0
C_Yr          0.588
C_lβ          -0.089
C_lδr         0.015
C_lδa         0.177
C_lp          -0.6
C_lr          (C_L/3.5) - 0.063
C_nβ          0.038
C_nβ̇          0.0
C_nδr         -0.055
C_nδa         -0.0354 C_L
C_np          -0.032
C_nr          -1.157
TABLE XII
MACH COEFFICIENTS

Coefficient   Value
C_mu          0
C_Lu          0
C_Du          0
TABLE XIII
CONTROL COEFFICIENTS

Loop                                   Symbol    Value
Heading hold using roll                P_φ       -1
Aileron from roll                      P_ail     -0.2
Altitude hold using elevator           P_el      -0.01
                                       I_el      -0.00005
                                       D_el      -1
Rudder from sideslip (coord. turns)    P_ff,rud  -10
Throttle for speed hold                P_th      3
                                       I_th      0.008
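As an illustrative sketch of how the Table XIII altitude-hold gains could be wired up, the loop can be written as a PID on altitude error driving elevator. The discrete-time form, time step and error sign convention here are assumptions, not the thesis's implementation.

```python
class PID:
    """Minimal discrete PID controller in the P/I/D gain form
    suggested by Table XIII (integration scheme is an assumption)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Altitude hold using elevator, gains P_el, I_el, D_el from Table XIII.
alt_loop = PID(kp=-0.01, ki=-0.00005, kd=-1.0)
elevator_cmd = alt_loop.step(err=5.0, dt=0.02)  # 5 m below commanded altitude
print(round(elevator_cmd, 6))  # -0.050005
```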
APPENDIX B Flamingo Open-Loop Stability

This open-loop stability analysis is developed using Nelson [124], with dynamic pressure q = 446.4886 Pa, a temperature of 15 ˚C and an altitude of sea level. Also, C_L is taken as 0.4 for typical operations.
Lateral Stability

Lateral-directional Derivatives

    L_β = q S b C_lβ / I_X                  (8.1)

    N_r = q S b² C_nr / (2 I_Z V_t)         (8.2)

    L_r = q S b² C_lr / (2 I_X V_t)         (8.3)

    N_β = q S b C_nβ / I_Z                  (8.4)

    L_p = q S b² C_lp / (2 I_X V_t)         (8.5)

    Y_β = q S C_Yβ / m                      (8.6)

    Y_r = q S b C_Yr / (2 m V_t)            (8.7)
Spiral approximation

    λ_spiral = (L_β N_r − L_r N_β) / L_β
             = q S b² (C_lβ C_nr − C_lr C_nβ) / (2 I_Z V_t C_lβ)        (8.8)

             = 446.4886 × 1.15 × 4² × (0.089 × 1.157 − 0.05129 × 0.038) / (2 × 9.18 × 27 × 0.089)
             = 18.81
Roll approximation

    λ_roll = L_p = q S b² C_lp / (2 I_X V_t)        (8.9)

           = 446.4886 × 1.15 × 4² × 0.6 / (2 × 5 × 27)
           = 18.26
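The spiral- and roll-mode roots can be reproduced directly from the Appendix A data. This sketch keeps the algebraic signs, so the stable modes come out negative (the text above quotes magnitudes).

```python
# Spiral (8.8) and roll (8.9) approximations from the Flamingo data.
q, S, b, Vt = 446.4886, 1.15, 4.0, 27.0   # dynamic pressure, wing area, span, speed
Ix, Iz = 5.0, 9.18                         # moments of inertia
CL = 0.4                                   # typical-operations lift coefficient
Cl_beta, Cn_beta = -0.089, 0.038
Cl_p, Cn_r = -0.6, -1.157
Cl_r = CL / 3.5 - 0.063                    # = 0.0513 at CL = 0.4

lam_spiral = q * S * b**2 * (Cl_beta * Cn_r - Cl_r * Cn_beta) / (2 * Iz * Vt * Cl_beta)
lam_roll = q * S * b**2 * Cl_p / (2 * Ix * Vt)
print(round(lam_spiral, 2), round(lam_roll, 2))  # -18.81 -18.26
```

Both roots are negative (convergent), consistent with the Level 1 assessment below.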
Dutch Roll approximation

    ω_nDR = sqrt[ (Y_β N_r − N_β Y_r + V_t N_β) / V_t ]
          = sqrt[ q S b ( q S b (C_nr C_Yβ − C_nβ C_Yr) + 2 m C_nβ ) / (2 m I_Z V_t²) ]        (8.10)

          = sqrt[ 2053.848 × (2053.848 × (0.35636 − 0.02231) + 1.52) / 267688.8 ]
          = 2.30
    ζ_DR = (1 / (2 ω_nDR)) (Y_β / V_t + N_r)
         = (1 / (2 ω_nDR)) (q S / V_t) ( C_Yβ / m + b² C_nr / (2 I_Z) )        (8.11)

         = (1 / (2 × 2.37)) × (446.489 × 1.15 / 27) × (0.308 / 20 + 4² × 1.157 / (2 × 9.18))
         = 4.11
Lateral Flying Qualities

Spiral
This represents excellent spiral characteristics (Level 1) according to Table 5.5 of Nelson, where λ_spiral is the characteristic root of the spiral mode.

    λ_spiral = 18.81        (8.12)

Roll
This represents excellent roll characteristics (Level 1) according to Table 5.5 of Nelson, where τ_roll is the roll time constant and λ_roll is the characteristic root of the roll mode.

    λ_roll = 18.26
    τ_roll = 1 / λ_roll = 0.0547        (8.13)

Dutch Roll
The Dutch Roll characteristics are very good, or Level 1, according to Nelson Table 5.6, where ω_nDR is the undamped natural frequency and ζ_DR is the damping ratio of the Dutch Roll mode.
    ω_nDR = 2.37
    ζ_DR = 4.11        (8.14)
Longitudinal Stability

Longitudinal Derivatives

    Z_u = −(C_Lu + 2 C_L0) q S / (m V_t)        (8.15)

    X_u = −(C_Du + 2 C_D0) q S / (m V_t)        (8.16)

    Z_α = −(C_Lα + C_D0) q S / m                (8.17)

    M_q = C_mq (c / (2 V_t)) (q S c / I_Y)      (8.18)

    M_α = C_mα q S c / I_Y                      (8.19)

    M_α̇ = C_mα̇ (c / (2 V_t)) (q S c / I_Y)      (8.20)
Phugoid mode

    ω_nP = sqrt( −Z_u g / V_t ) = sqrt( q S g (C_Lu + 2 C_L0) / (m V_t²) )        (8.21)

         = sqrt( 446.4886 × 1.15 × 9.80665 × (2 × 0.04) / (20 × 27²) )
         = 0.1662
    ζ_P = −X_u / (2 ω_nP) = q S (C_Du + 2 C_D0) / (2 ω_nP m V_t)        (8.22)

        = 446.4886 × 1.15 × (2 × 0.02) / (2 × 0.1662 × 20 × 27)
        = 0.1144
Short Period mode

    ω_nSP = sqrt( Z_α M_q / V_t − M_α )
          = sqrt( (q S c / I_Y) ( q S c (C_Lα + C_D0) C_mq / (2 m V_t²) + C_mα ) )        (8.23)

          = sqrt( (446.4886 × 1.15 × 0.29 / 6.28) × (446.4886 × 1.15 × 0.29 × 6.02 × 41.3 / (2 × 20 × 27²) + 0.85) )
          = 7.09
    ζ_SP = −(M_q + M_α̇ + Z_α / V_t) / (2 ω_nSP)        (8.24)
         = 0.22
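The phugoid approximation and the short-period natural frequency can be checked numerically against the Appendix A data; this sketch reproduces (8.21)-(8.23) (the short-period damping ratio is left to the derivation above).

```python
import math

# Flamingo data (Appendix A) and flight condition.
q, S, c, Vt, m, Iy, g = 446.4886, 1.15, 0.29, 27.0, 20.0, 6.28, 9.80665
CL0, CD0, CLa, Cma, Cmq = 0.04, 0.02, 6.0, -0.85, -41.3
CLu = CDu = 0.0                      # Mach coefficients (Table XII)

qS = q * S
# Phugoid mode, (8.21)-(8.22).
wn_p = math.sqrt(qS * g * (CLu + 2 * CL0) / (m * Vt**2))
zeta_p = qS * (CDu + 2 * CD0) / (2 * wn_p * m * Vt)
# Short-period undamped natural frequency, (8.23).
Za = -(CLa + CD0) * qS / m
Mq = Cmq * (c / (2 * Vt)) * (qS * c / Iy)
Ma = Cma * (qS * c / Iy)
wn_sp = math.sqrt(Za * Mq / Vt - Ma)
print(round(wn_p, 4), round(zeta_p, 4), round(wn_sp, 2))  # 0.1662 0.1144 7.09
```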
Longitudinal Flying Qualities

Phugoid
These phugoid characteristics are considered Level 1, or good, according to Nelson Table 4.10, where ω_nP is the undamped natural frequency and ζ_P is the damping ratio of the phugoid mode.
    ω_nP = 0.1662
    ζ_P = 0.1144        (8.25)
Short Period
These short-period characteristics are considered Level 2, or acceptable, according to Nelson Table 4.10, where ω_nSP is the undamped natural frequency and ζ_SP is the damping ratio of the short-period mode.

    ω_nSP = 7.09
    ζ_SP = 0.22        (8.26)
APPENDIX C Image Area Expansion

An image pixel is considered to have approximately a square relationship (from Figure 8). Thus, the angular area subtended by the image is

    Θ = θ²        (8.27)

and from Figure 8 we can let

    A = d²        (8.28)

Now if we assume (for small angles) that

    θ ≈ d / R        (8.29)

then

    θ ≈ √A / R        (8.30)

so

    Θ = A / R²        (8.31)

Thus,

    ∂Θ/∂t = ∂(A / R²)/∂t        (8.32)

From the quotient rule,

    ∂(u/v)/∂t = ( v ∂u/∂t − u ∂v/∂t ) / v²        (8.33)

Substituting (8.32) into (8.33) gives:
    ∂Θ/∂t = ( R² ∂A/∂t − A ∂R²/∂t ) / R⁴        (8.34)

However, the physical area of the intruder does not change:

    ∂A/∂t = 0        (8.35)

So Equation (8.34) becomes

    ∂Θ/∂t = −( A / R⁴ ) ∂R²/∂t        (8.36)

Now substitute in Equation (8.31):

    ∂Θ/∂t = −( Θ / R² ) ∂(R·R)/∂t        (8.37)

The product rule states

    ∂(u·v)/∂t = u ∂v/∂t + v ∂u/∂t        (8.38)

so

    ∂(R·R)/∂t = 2 R ∂R/∂t        (8.39)

And it is known that the closing speed is

    V = −∂R/∂t        (8.40)

Thus, putting Equations (8.39) and (8.40) back into Equation (8.37) gives

    ∂Θ/∂t = 2 Θ V / R        (8.41)

But we know

    V = R / T_TTC        (8.42)

where T_TTC is the time to collision (TTC). Therefore, putting Equation (8.42) back into Equation (8.41):
    ∂Θ/∂t = 2 Θ / T_TTC        (8.43)

So finally,

    T_TTC = 2 Θ / ( ∂Θ/∂t )        (8.44)
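The result in (8.44) can be sanity-checked numerically: for a constant-velocity approach, 2Θ/Θ̇ recovers the true range-to-speed ratio R/V without either quantity being measured directly. The area, speed and range values below are illustrative.

```python
# Numerical check of (8.44): Theta = A / R^2 expands such that
# 2*Theta / Theta_dot equals the true time to collision R/V.
A = 10.0      # physical cross-sectional area of intruder, m^2 (illustrative)
V = 40.0      # closing speed, m/s (illustrative)
R = 1000.0    # current range, m (illustrative)
dt = 1e-4     # small step for a finite-difference expansion rate

theta_now = A / R**2
theta_next = A / (R - V * dt)**2
theta_dot = (theta_next - theta_now) / dt   # image area expansion rate

ttc_estimate = 2 * theta_now / theta_dot
print(round(ttc_estimate, 2))  # 25.0, i.e. R/V = 1000/40 seconds
```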
9 Bibliography
[1] J. Baldwin, et al., "General aviation collision avoidance - challenges of full implementation," in Digital Avionics Systems Conference, 1994. 13th DASC., AIAA/IEEE, 1994, pp. 504-509.
[2] K. M. Caldwell, "TCAS II testing conflicts and resolutions," in Proceedings of the 18th ICAS Congress, Beijing, China, 1992, pp. 9-17.
[3] US-OSoD, Unmanned systems roadmap 2009-2034, Fourth ed. Washington, District of Columbia: Office of the Secretary of Defense, 2009.
[4] M. T. DeGarmo, "Issues concerning integration of unmanned aerial vehicles in civil airspace," MITRE, Center for Advanced Aviation System Development, McLean, Virginia, 2004.
[5] B. C. Meyer, "Notes on flying and dying," Psychoanalytic Quarterly, vol. 52, pp. 327-352, 1983.
[6] L. R. Newcombe, Unmanned aviation: a brief history of unmanned aerial vehicles: AIAA, 2004.
[7] Teal, World unmanned aerial vehicle systems - market profile and forecast, 2010.
[8] DoD, Due regard technology for unmanned aerial systems, 2009.
[9] R. E. Weibel and R. J. Hansman, "Safety considerations for operation of unmanned aerial vehicles in the national airspace system," MIT International Center for Air Transportation, Cambridge, Massachusetts, 2005.
[10] K. Dalamagkidis, et al., "On unmanned aircraft systems issues, challenges and operational restrictions preventing integration into the national airspace system," Progress in Aerospace Sciences, vol. 44, pp. 503-519, 2008.
[11] A. D. Zeitlin, "Technology milestones - detect, sense & avoid for unmanned aircraft systems," in Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Rohnert Park, California, 2007.
[12] JAPCC, "The Joint Air Power Competence Centre (JAPCC) flight plan for unmanned aircraft systems (UAS) in NATO," Joint Air Power Competence Centre, 15/03/2007.
[13] CARE, "Integration of UAVs into future air traffic management," Cooperative Actions of R&D in Eurocontrol, 2001.
[14] S. Attila, "Technology demonstration study on sense and avoid technologies for long endurance unmanned aerial vehicles," European Defense Agency, Brussels, Belgium, 2007.
[15] Air4All, "Air4All workshop 1," 2008.
[16] ASTM Standards, F2411-07 - Standard specification for design and performance of an airborne sense-and-avoid system, 2007.
[17] W. E. Green and P. Y. Oh, "Optic-flow-based collision avoidance," IEEE Robotics & Automation Magazine, vol. 15, pp. 96-103, 2008.
[18] E. S. Jang, et al., "Collision avoidance of a mobile robot for moving obstacles based on impedance force control algorithm," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Edmonton, Canada, 2005, pp. 382-387.
[19] S. C. Han and H. Bang, "Proportional navigation-based optimal collision avoidance for UAVs," in Proceedings of the 2nd International Conference on Autonomous Robots and Agents, Palmerston North, New Zealand, 2004.
[20] J. M. Hoekstra, et al., "Free flight in a crowded airspace," Progress in Astronautics and Aeronautics, pp. 533-545, 2001.
[21] A. Bicchi and L. Pallottino, "On optimal cooperative conflict resolution for air traffic management systems," IEEE Transactions on Intelligent Transportation Systems, vol. 1, pp. 221-231, 2000.
[22] B. Wetherby, et al., Full AERA services operational description. McLean, Virginia: The MITRE Corporation, 1993.
[23] H. Erzberger, "The automated airspace concept," presented at the Proceedings of the 4th USA/Europe Air Traffic Management R&D Seminar, Santa Fe, New Mexico, 2001.
[24] A. R. Lacher, et al., "Unmanned aircraft collision avoidance - technology assessment and evaluation methods," in Proceedings of the 7th Air Traffic Management Research & Development Seminar, Barcelona, Spain, 2007.
[25] O. Shakernia, et al., "Passive ranging for UAV sense and avoid applications," in Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Arlington, Virginia, 2005, pp. 1-10.
[26] G. Fasano, "Multisensor based fully autonomous non-cooperative collision avoidance system for UAVs," PhD thesis, Aerospace Systems Research Group, University of Naples, Naples, Italy, 2008.
[27] SKYbrary, "Near mid air collision (NMAC)," 2010. Available: http://www.skybrary.aero/index.php/NMAC
[28] K. Dalamagkidis, et al., On integrating unmanned aircraft systems into the national airspace system: issues, challenges, operational restrictions, certification, and recommendations: Springer Verlag, 2009.
[29] B. C. Karhoff, et al., "Eyes in the domestic sky: an assessment of sense and avoid technology for the army's "Warrior" unmanned aerial vehicle," in IEEE Systems and Information Engineering Design Symposium, 2006, pp. 36-42.
[30] H. Voos, "UAV "see and avoid" with nonlinear filtering and non-cooperative avoidance," in Proceedings of the 13th IASTED International Conference Robotics and Applications, Wurzburg, Germany, 2007.
[31] W.-Z. Chen, "Sense and avoid (SAA) technologies for unmanned aircraft (UA)," National Cheng Kung University, 2008.
[32] nVidia, "GeForce GTX 280," 2009. Available: http://www.nvidia.com/object/product_geforce_gtx_280_us.html
[33] J. Lai and J. J. Ford, "Relative entropy rate based multiple hidden Markov model approximation," IEEE Transactions on Signal Processing, 2009 (accepted, to appear).
[34] E. W. Frew, "Observer trajectory generation for target-motion estimation using monocular vision," PhD thesis, Department of Aeronautics and Astronautics, Stanford University, Stanford, California, 2003.
[35] A. J. Calise, et al., "Estimation and guidance strategies for vision based target tracking," in Proceedings of the American Control Conference ACC, Portland, Oregon, 2005, pp. 5079-5084.
[36] M. Kochenderfer, et al., "Hazard alerting using line-of-sight rate," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, 2008, pp. 2003-2011.
[37] B. Young, "Silvertone Flamingo UAV," 2009. Available: http://www.silvertoneuav.com/flamingo.php
[38] R. C. Wolfe, "NASA ERAST non-cooperative DSA flight test," 2003.
[39] W. Graham and R. H. Orr, "Separation of air traffic by visual means: an estimate of the effectiveness of the see-and-avoid doctrine," Proceedings of the IEEE, vol. 58, pp. 337-361, 1970.
[40] Y. Ikeda, et al., "Automatic air collision avoidance system," in Proceedings of the 41st SICE Annual Conference, Osaka, Japan, 2002, pp. 630-635.
[41] Y. Ikeda and J. Kay, "An optimal control problem for automatic air collision avoidance," in Proceedings of the 42nd IEEE Conference on Decision and Control CDC, Maui, Hawaii, 2003, pp. 2222-2227.
[42] L. Matthies, et al., "Kalman filter-based algorithms for estimating depth from image sequences," International Journal of Computer Vision, vol. 3, pp. 209-238, 1989.
[43] J. Utt, et al., "Test and integration of a detect and avoid system," in Proceedings of the AIAA 3rd "Unmanned Unlimited" Technical Conference, Workshop and Exhibit, Chicago, Illinois, 2004.
[44] J. Utt, et al., "Development of a sense and avoid system," in Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Arlington, Virginia, 2005.
[45] K. R. Suwal, et al., "SeFAR integration test bed for see and avoid technologies," in Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Arlington, Virginia, 2005, pp. 1-7.
[46] O. Shakernia, et al., "Sense and avoid (SAA) flight test and lessons learned," in Proceedings of the AIAA Infotech@Aerospace Conference and Exhibit, Rohnert Park, California, 2007.
[47] A. Richards and J. P. How, "Aircraft trajectory planning with collision avoidance using mixed integer linear programming (MILP)," in Proceedings of the American Control Conference ACC, Anchorage, Alaska, 2002, pp. 1936-1941.
[48] E. W. Frew, et al., "Adaptive receding horizon control for vision-based navigation of small unmanned aircraft," in Proceedings of the American Control Conference ACC, Minneapolis, Minnesota, 2006.
[49] A. Richards and J. How, "A decentralized algorithm for robust constrained model predictive control," in Proceedings of the American Control Conference ACC, Boston, Massachusetts, 2004, pp. 4261-4266.
[50] J. B. Froisy, "Model predictive control - building a bridge between theory and practice," Computers and Chemical Engineering, vol. 30, pp. 1426-1435, 2006.
[51] L. Pallottino, et al., "Conflict resolution problems for air traffic management systems solved with mixed integer programming," IEEE Transactions on Intelligent Transportation Systems, vol. 3, pp. 3-11, 2002.
[52] D. H. Shim, et al., "Decentralized nonlinear model predictive control of multiple flying robots," in Proceedings of the 42nd IEEE Conference on Decision and Control CDC, Maui, Hawaii, 2003, pp. 3621-3626.
[53] E. W. Frew, "Receding horizon control using random search for UAV navigation with passive, non-cooperative sensing," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, San Francisco, California, 2005, pp. 1-13.
[54] E. W. Frew, "Approximating information content for active sensing tasks using the unscented transform," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Nice, France, 2008, pp. 2559-2564.
[55] O. Khatib, "Real-time obstacle avoidance for manipulators and mobile robots," The International Journal of Robotics Research, vol. 5, pp. 90-98, 1986.
[56] P. Corke, "Mobile robot navigation as a planar visual servoing problem," in Robotics Research, vol. 6: Springer, 2003, pp. 361-372.
[57] J. Ren, et al., "Modified Newton's method applied to potential field-based navigation for nonholonomic robots in dynamic environments," Robotica, vol. 26, pp. 117-127, 2007.
[58] J. C. Latombe, Robot motion planning: Kluwer Academic Publishers, 1991.
[59] I. Ulrich and J. Borenstein, "VFH+: reliable obstacle avoidance for fast mobile robots," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Leuven, Belgium, 1998, pp. 1572-1577.
[60] I. Ulrich and J. Borenstein, "VFH*: local obstacle avoidance with look-ahead verification," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, San Francisco, California, 2000, pp. 2505-2511.
[61] J. Borenstein and Y. Koren, "The vector field histogram - fast obstacle avoidance for mobile robots," IEEE Transactions on Robotics and Automation, vol. 7, pp. 278-288, 1991.
[62] Y. Koren and J. Borenstein, "Potential field methods and their inherent limitations for mobile robot navigation," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Sacramento, California, 1991, pp. 1398-1404.
[63] G. K. Schmidt and K. Azarm, "Mobile robot navigation in a dynamic world using an unsteady diffusion equation strategy," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Raleigh, North Carolina, 1992, pp. 642-647.
[64] C. I. Connolly, et al., "Path planning using Laplace's equation," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Cincinnati, Ohio, 1990, pp. 2102-2106.
[65] A. A. Masoud, et al., "Robot navigation using a pressure generated mechanical stress field: the biharmonic potential approach," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, San Diego, California, 1994, pp. 124-129.
[66] D. A. Lawrence, "Lyapunov vector fields for UAV flock coordination," in Proceedings of the AIAA 2nd "Unmanned Unlimited" Technical Conference, Workshop and Exhibit, San Diego, California, 2003.
[67] E. W. Frew, "Cooperative standoff tracking of uncertain moving targets using active robot networks," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Rome, Italy, 2007, pp. 3277-3282.
[68] E. W. Frew and D. Lawrence, "Cooperative stand-off tracking of moving targets by a team of autonomous aircraft," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, San Francisco, California, 2005.
[69] S. R. Griffiths, "Vector field approach for curved path following for miniature aerial vehicles," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Keystone, Colorado, 2006, pp. 63-64.
[70] E. W. Frew, et al., "Lyapunov guidance vector fields for unmanned aircraft applications," in Proceedings of the American Control Conference ACC, New York City, 2007, pp. 371-376.
[71] E. W. Frew, et al., "Coordinated standoff tracking of moving targets using Lyapunov guidance vector fields," AIAA Journal of Guidance, Control, and Dynamics, vol. 31, p. 290, 2008.
[72] S. Griffiths, et al., "Maximizing miniature aerial vehicles," IEEE Robotics & Automation Magazine, vol. 13, pp. 34-43, 2006.
[73] S. Griffiths, et al., "Obstacle and terrain avoidance for miniature aerial vehicles," IEEE Robotics & Automation Magazine, vol. 13, pp. 34-43, 2006.
[74] D. R. Nelson, et al., "Vector field path following for miniature air vehicles," IEEE Transactions on Robotics, vol. 23, pp. 519-529, 2007.
[75] Procerus, "Kestrel autopilot," 2010. Available: http://www.procerusuav.com/productsKestrelAutopilot.php
[76] S. C. Degen, et al., "Tensor field guidance for time-based waypoint arrival of UAVs by 4D trajectory generation," in Proceedings of the IEEE Aerospace Conference, Big Sky, Montana, 2009.
[77] K. Sigurd and J. How, "UAV trajectory design using total field collision avoidance," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Austin, Texas, 2003.
[78] M. Massink and N. De Francesco, "Modelling free flight with collision avoidance," in Proceedings of the IEEE International Conference on Engineering of Complex Computer Systems, Skovde, Sweden, 2001, pp. 270-279.
[79] RTCA, "Final report of the RTCA task force 3: free flight implementation," RTCA, Washington, District of Columbia, 1995.
[80] A. Chakravarthy and D. Ghose, "Obstacle avoidance in a dynamic environment: a collision cone approach," IEEE Transactions on Systems, Man and Cybernetics, Part A, vol. 28, pp. 562-574, 1998.
[81] Y. Watanabe, et al., "Vision-based obstacle avoidance for UAVs," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Hilton Head, South Carolina, 2007, pp. 20-23.
[82] B. Ajith Kumar and D. Ghose, "Radar-assisted collision avoidance/guidance strategy for planar flight," IEEE Transactions on Aerospace and Electronic Systems, vol. 37, pp. 77-90, 2001.
[83] C. Carbone, et al., "A novel 3D geometric algorithm for aircraft autonomous collision avoidance," in Proceedings of the 45th IEEE Conference on Decision and Control CDC, San Diego, California, 2006, pp. 1580-1585.
[84] S. Arulampalam, et al., "Bearings-only tracking of manoeuvring targets using particle filters," Journal on Applied Signal Processing, vol. 15, pp. 2351-2365, 2004.
[85] T. Bréhard and J. P. Le Cadre, "Initialization of particle filter and posterior Cramér-Rao bound for bearings-only tracking in modified polar coordinate system," IEEE Transactions on Aerospace and Electronic Systems, 2004.
[86] Y. Yu and Q. Cheng, "Particle filters for maneuvering target tracking problem," Signal Processing, vol. 86, pp. 195-203, 2006.
[87] G. Lei, et al., "Posterior Cramer-Rao lower bounds for multitarget bearings-only tracking," Journal of Systems Engineering and Electronics, vol. 19, pp. 1127-1132, 2008.
[88] V. Aidala and S. Hammel, "Utilization of modified polar coordinates for bearings-only tracking," IEEE Transactions on Automatic Control, vol. 28, pp. 283-294, 1983.
[89] S. Arulampalam and B. Ristic, "Comparison of the particle filter with range-parameterized and modified polar EKFs for angle-only tracking," in Proceedings of the Signal and Data Processing of Small Targets 2000, Orlando, Florida, 2000, pp. 288-299.
[90] G. Recchia, et al., "An optical flow based electro-optical see-and-avoid system for UAVs," in Proceedings of the IEEE Aerospace Conference, Big Sky, Montana, 2007, pp. 1-9.
[91] B. Call, et al., "Obstacle avoidance for unmanned air vehicles using image feature tracking," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Keystone, Colorado, 2006.
[92] P. J. Shelnutt, "Collision avoidance for UAVs using optic flow measurement with line of sight rate equalization and looming," MSc thesis, Air Force Institute of Technology, Air University, Wright-Patterson Air Force Base, Ohio, 2008.
[93] P. Angelov, et al., "A passive approach to autonomous collision detection and avoidance," in Proceedings of the Tenth International Conference on Computer Modeling and Simulation, Cambridge, United Kingdom, 2008, pp. 64-69.
[94] E. W. Frew and S. M. Rock, "Trajectory generation for constant velocity target motion estimation using monocular vision," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Taipei, Taiwan, 2003, pp. 3479-3484.
[95] A. Logothetis, et al., "An information theoretic approach to observer path design for bearings-only tracking," in Proceedings of the 36th IEEE Conference on Decision and Control CDC, San Diego, California, 1997, pp. 3132-3137.
[96] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches [Tutorial]," IEEE Robotics & Automation Magazine, vol. 13, pp. 82-90, 2006.
[97] F. Chaumette and S. Hutchinson, "Visual servo control. II. Advanced approaches [Tutorial]," IEEE Robotics & Automation Magazine, vol. 14, pp. 109-118, 2007.
[98] D. Regan and R. Gray, "Visually guided collision avoidance and collision achievement," Trends in Cognitive Sciences, vol. 4, pp. 99-107, 2000.
[99] ATSB, "Limitations of the see and avoid principle," Canberra, Australia, 2004.
[100] F. Hoyle, The black cloud: Penguin, 1957.
[101] D. Sislak, et al., "Negotiation-based approach to UAVs," in IEEE Workshop on Distributed Intelligent Systems: Collective Intelligence and Its Applications DIS, Prague, Czech Republic, 2006, pp. 279-284.
[102] Chapter 3 - Section 3.2: Right of way, 2009.
[103] Subpart B - Section 91.113: Right-of-way rules; except water operations, 2004.
[104] Division 1 - Regulation 162: Rules for prevention of collision, 1988.
[105] F. R. Garza and E. A. Morelli, "A collection of nonlinear aircraft simulations in MATLAB," National Aeronautics and Space Administration, NASA/TM-2003-212145, 2003.
[106] B. L. Stevens and F. L. Lewis, Aircraft control and simulation. Georgia Tech Research Institute/University of Texas: Wiley-Interscience, 1992, p. 9.
[107] N. Cowan, et al., "Vision-based follow-the-leader," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Las Vegas, Nevada, 2003, pp. 1796-1801.
[108] FAA, "P-8740-51 - How to avoid a mid air collision," 1987.
[109] L. Mejias, et al., "Towards implementation of vision-based UAS sense and avoid systems," in International Council of the Aeronautical Sciences ICAS, Nice, France, 2010.
[110] O. Bourquardez and F. Chaumette, "Visual servoing of an airplane for alignment with respect to a runway," in Proceedings of the IEEE International Conference on Robotics and Automation ICRA, Rome, Italy, 2007, pp. 1330-1335.
[111] J. Kuchar, "Methodology for alerting-system performance evaluation," AIAA Journal of Guidance, Control, and Dynamics, vol. 19, pp. 438-444, 1996.
[112] Eurocontrol, "Final report on studies on the safety of ACAS II in Europe," ACAS/ACASA/02-014, 2002.
[113] M. J. Kochenderfer, et al., "A comprehensive aircraft encounter model of the National Airspace System," Lincoln Laboratory Journal, vol. 17, pp. 41-53, 2008.
[114] M. Kochenderfer, et al., "Uncorrelated encounter model of the National Airspace System version 1.0," Massachusetts Institute of Technology, Lincoln Laboratory, Lexington, 2008.
[115] M. J. Kochenderfer, et al., "Model-based optimization of airborne collision avoidance logic," Massachusetts Institute of Technology, Lincoln Laboratory, Lexington, 2010.
[116] M. Barkat, Signal detection and estimation: Artech House Publishers, 2005.
[117] L. Winder and J. Kuchar, "Evaluation of collision avoidance maneuvers for parallel approach," AIAA Journal of Guidance, Control, and Dynamics, vol. 22, pp. 801-807, 1999.
[118] Eurocontrol, "Final report on the safety of ACAS II in the European RVSM environment," ASARP/WP9/72/D, 2006.
[119] J. Kuchar and A. Drumm, "The Traffic Alert and Collision Avoidance System," Lincoln Laboratory Journal, vol. 16, pp. 277-296, 2007.
[120] T. Hutchings, et al., "Architecting UAV sense & avoid systems," in Proceedings of the IET Conference on Autonomous Systems, London, United Kingdom, 2007, pp. 1-8.
[121] S. Endoh, "Aircraft collision models," MSc thesis, Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1982.
[122] Number 90-48C, 1983.
[123] M. Kochenderfer, et al., "Airspace encounter models for estimating collision risk," AIAA Journal of Guidance, Control, and Dynamics, vol. 33, 2010.
[124] R. C. Nelson, Flight stability and automatic control, 2nd ed. University of Notre Dame: McGraw Hill, 1998.
[125] G. Bonin, "Flamingo model data," Riverwood, NSW, 2009.