OTEC 2015 Conference October 27–30, 2015, Columbus, Ohio
Practical Aspects of UAS-based Point Cloud Generation
Greg Jozkow
Acknowledgement to Charles Toth
Introduction
Geospatial data acquisition (GIS) with small UAS (sUAS)
Advantages:
- Inexpensive platform
- Lightweight vehicle and sensors
- Easy deployment and operation
- Minimal logistics
- Relatively high resolution data
Challenges:
- Limited payload
- Limited operating range
- Short flight time
- Modest/low sensor performance
- Accurate georeferencing (e.g., direct georeferencing needed for LiDAR)
Typical Workflow for UAS-based GIS Data
Present practice:
- Data acquisition: mainly optical image capture; limited logging of georeferencing data
- Sensor orientation based on image block adjustment, primarily using ground control and rarely air control
- Dense image matching used to generate the point cloud
- Point cloud processing: filtering, DEM generation, segmentation
- Orthoimage and mosaic generation
- Quality control, accuracy assessment
Ongoing developments:
- Improving sensor technologies, e.g., UAS-specific LiDAR or HSI cameras
- More georeferencing options, e.g., dual-frequency GPS, tactical-grade IMU
Research topics and directions:
- Impact of GPS/IMU on point cloud generation
- Photogrammetrically derived point cloud vs. LiDAR point cloud
UAS Point Clouds
- Laser scanning
- Dense image matching
UAS Point Clouds: Semi-Global Matching (SGM)
- Pixel-based matching, based on disparity optimization
- First term: pixelwise matching cost
- Second term: penalty for all pixels with a different disparity (D)
- Matching cost was originally MI (mutual information), now Census (Hamming distance)
- Fast implementations (GPU, FPGA)
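Written out (in the standard form from Hirschmuller's work), the energy minimized over the disparity image D is

    E(D) = \sum_p \left( C(p, D_p) + \sum_{q \in N_p} P_1 \, T[|D_p - D_q| = 1] + \sum_{q \in N_p} P_2 \, T[|D_p - D_q| > 1] \right)

where C(p, D_p) is the pixelwise matching cost, N_p is the neighborhood of pixel p, T[.] equals 1 when its argument is true, and P_1 < P_2 are the penalties for small and large disparity changes.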
H. Hirschmuller, Semi-Global Matching: Motivation, Developments and Applications, Photogrammetric Week (PhoWo) 2011, Ed. D. Fritsch
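For illustration only (this is not the implementation used in this work), OpenCV's StereoSGBM exposes the same P1/P2 penalties; a minimal sketch for a rectified stereo pair, with placeholder file names:

    import cv2

    # Rectified stereo pair (placeholder file names).
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    block = 5  # matching window size
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,     # search range; must be divisible by 16
        blockSize=block,
        P1=8 * block * block,   # penalty for a disparity change of 1
        P2=32 * block * block,  # larger penalty for bigger disparity changes
    )

    # compute() returns fixed-point disparities (int16), scaled by 16.
    disparity = matcher.compute(left, right).astype("float32") / 16.0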
Georeferencing/Navigation Alternatives
(a) Direct georeferencing: imaging is used for mapping, reconnaissance, etc.
(b) Indirect georeferencing (typical UAS practice): imaging is used for orientation first, and then for mapping, reconnaissance, etc.
(c) Integrated sensor orientation (ISO): imaging is used for orientation first, including TRN (past practice) and SLAM (presently), and then for mapping, reconnaissance, etc.
Direct vs. Indirect Georeferencing
Direct georeferencing (high accuracy):
- Carrier-phase differential GPS (cm-level accuracy)
- Medium-grade (tactical) IMU (3-15 °/h gyro)
- Optional sensors (magnetometer, barometer, etc.)
- Time synchronization of the sensors is needed
- Sensor and intra-sensor calibration, such as boresighting, are important
Indirect georeferencing:
- Image overlap requirements
- Image feature extraction and matching
- Bundle block adjustment (AT, aerial triangulation)
- Ground control is needed (GPS-surveyed natural and/or signalized targets)
- Camera self-calibration capability
ISO georeferencing:
- Combines the benefits of both methods; platform navigation data can be used as air control; provides the most robust solution
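The calibration and synchronization requirements follow from the standard direct georeferencing equation (generic notation, not taken from the slides):

    r_P^m = r_{GPS}^m(t) + R_b^m(t) \left( s \, R_c^b \, r_P^c + a^b \right)

where r_{GPS}^m(t) is the GPS-derived platform position in the mapping frame, R_b^m(t) the IMU-derived attitude, R_c^b and a^b the boresight rotation and lever arm between the imaging sensor and the body frame, r_P^c the image ray, and s its scale; timing errors corrupt both time-dependent terms, and boresight/lever-arm errors bias every reconstructed point.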
Error Characterization
Direct orientation:
- Error propagation has an extrapolation character
- IO (interior orientation) errors are uncompensated (!)
- Larger variances of the reconstructed points
Indirect orientation (integrated sensor orientation):
- Error propagation has an interpolation character
- Interior orientation errors are absorbed by the AT
- Platform orientation accuracy is not of interest
- Even with wrong IO, the AT yields an optimal EO reconstruction
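A first-order error budget illustrates the extrapolation character (the numbers are illustrative assumptions, not results from this test): for near-nadir imaging,

    \sigma_{ground} \approx \sqrt{ \sigma_{pos}^2 + (H \, \sigma_{att})^2 }

so at a flying height of H = 120 m, an attitude error of only 0.05 deg (about 8.7 x 10^-4 rad) already adds H \sigma_{att} \approx 0.10 m to every ground point, and uncompensated IO errors come on top of that.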
Test Configuration: Navigation Sensors
- Epson M-G362PDC1 IMU
- Garmin GPS-18LV L1 GPS (Velodyne configuration)
- Antcom L1/L2 antenna with NovAtel OEM615 GPS
- Wookong-M autopilot: L1 GPS, MEMS IMU, magnetometer (platform navigation)
- Gigabyte GB-BXi7-4500 PC (data recording)
- Solmeta Geotagger N3 L1 GPS (Nikon configuration)
- MicroStrain 3DM-GX3-35 IMU with L1 GPS and magnetometer
Small UAS Navigation Sensors: GPS

Dual-frequency GPS: NovAtel OEM615
- Event input: yes
- PPS output: yes
- Single Point L1 RMS: 1.5 m
- Single Point L1/L2 RMS: 1.2 m
- Time accuracy: 20 ns
- Power requirements: 1 W
- Weight without / with equipment: 24 g / 300 g

Single-frequency GPS (autopilot systems): u-blox LEA-6H
- No raw data, no timing
- Horizontal position accuracy (without aiding): 2.5 m
- Time pulse accuracy: 30 ns
- Power requirements: 121 mW
- Weight without / with equipment: 2 g / 17 g
Small UAS Navigation Sensors: IMU

Epson M-G362PDC1
- Gyro bias: 3 °/h
- Gyro random walk: N/A
- Accelerometer bias: 40 mg
- Accelerometer noise: 40 mg/Hz^1/2
- Power requirements: 30 mA via USB
- Weight without / with equipment: 7 g / 30 g

MicroStrain 3DM-GX3-35
- Gyro bias: 18 °/h, 15 °/h
- Gyro random walk: 0.1 °/h^1/2, N/A
- Accelerometer bias: > 60 mg
- Accelerometer noise: > 250 µg/Hz^1/2
- Power requirements: > 4 mA (IMU only)

Analog Devices ADIS16364
- Gyro bias: 25 °/h
- Gyro random walk: 2 °/h^1/2
- Accelerometer bias: 8 mg
- Accelerometer noise: 270 µg/Hz^1/2
- Power requirements: 49 mA
- Weight without / with equipment: 16 g / > 50 g
Test Configuration: Imaging Sensors

Nikon D800
- Full-frame 36 Mpix DSLR camera
- Nikon Nikkor AF-S 50 mm f/1.4G lens
- 1 s triggering (1 FPS)
- Fixed to the platform in nadir-facing direction

GoPro Hero 3+ Black Edition
- 12 Mpix
- Non-stock 5.4 mm lens
- 0.5 s triggering (2 FPS)
- Gyro-stabilized gimbal (vertical images)

Velodyne HDL-32E
- Usable returns up to 70 m
- 32 laser diodes
- 700,000 points/s
- FOV: +30° (front) to -10° (rear) @ 360°
Test Flights: Topographical Mapping

Site | Flying height [m] | Flying speed [m/s] | Flight lines | GSD Nikon [mm] | GSD GoPro [mm] | Velodyne flight | No. of GCPs
1    | 120               | 4                  | 2            | 12             | -              | -               | 31
2    | 125               | 4                  | 2            | 12             | -              | -               | 0
3    | 125               | 4                  | 2            | 12             | -              | -               | 15
4    | 25                | 4                  | 3            | 2.5            | 7              | yes             | 19
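The tabulated GSD values follow from the usual relation GSD = p H / f, where p is the pixel pitch, H the flying height, and f the focal length. Assuming the Nikon D800 pixel pitch of about 4.9 µm (derived from the sensor format, not stated on the slide): 4.9 µm x 120 m / 50 mm ≈ 12 mm at Sites 1-3, and 4.9 µm x 25 m / 50 mm ≈ 2.5 mm at Site 4, matching the table.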
Results: GPS Receivers' Performance at Site 4

Positioning accuracy [m]:

Receiver / solution | Positioning method | Accuracy source | Horizontal | Vertical | 3D
Geotagger | Single Point (code) | According to specification | 3.0 | N/A | N/A
MicroStrain | Single Point (code) | Computed by receiver | 4.6 | 5.6 | 7.2
NovAtel | Differential (carrier-phase) | Computed in RTKLIB | 0.029 | 0.046 | 0.054
Indirect georeferencing (Nikon images) | Bundle block adjustment based on 15 accurate GCPs | Computed from residuals at GCPs | 0.016 | 0.018 | 0.025

The indirect georeferencing solution (AT) was subsequently used as the reference (note that the image centers are not constrained in the AT).
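For the entries computed from residuals at GCPs, the computation is a plain RMSE over the check-point residuals; a minimal sketch (array contents are placeholders):

    import numpy as np

    def rmse_3d(residuals):
        # residuals: N x 3 array of per-GCP residuals (East, North, Up) in meters
        ms = np.mean(residuals ** 2, axis=0)  # mean squared error per axis
        horizontal = np.sqrt(ms[0] + ms[1])   # combined East/North RMSE
        vertical = np.sqrt(ms[2])             # Up RMSE
        return horizontal, vertical, np.sqrt(ms.sum())

    # Example with synthetic residuals at 15 GCPs:
    res = np.random.normal(scale=0.01, size=(15, 3))
    print(rmse_3d(res))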
Results: Direct Georeferencing at Site 4

Position RMSE [m]:

Receiver       | Horizontal | Vertical | 3D
A: Geotagger   | 5.1        | 10.2     | 11.4
B: MicroStrain | 8.6        | 16.5     | 18.6
C: NovAtel     | 0.025      | 0.035    | 0.043

Reference: AT solution
Results: Point Cloud Generation from Images at Site 4

Nikon:
- 108 images (more Mpix)
- Smaller FOV (smaller area, more gaps)
- ~2.5M points (higher density)
- Lower noise
- 3D RMSE: 0.07 m

GoPro:
- 131 images (fewer Mpix)
- Larger FOV (larger area, fewer gaps)
- ~0.6M points (lower density)
- High noise
- 3D RMSE: 0.77 m
Results: Nikon vs. GoPro, Site 4
Results: GoPro Image Limitations at Site 4
- Image distortion caused by the rolling shutter, impossible to model
- Additional blur
- Resulting in a poor AT (3D RMSE: 0.42 m) and a noisy, poor point cloud (3D RMSE: 0.77 m)
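The size of the rolling-shutter effect is easy to bound. Assuming a sensor readout time on the order of 30 ms (a typical value for this camera class, not a measured one), at the 4 m/s flying speed the camera moves

    \Delta x = v \, t_{readout} \approx 4 \text{ m/s} \times 0.03 \text{ s} = 0.12 \text{ m}

between the first and last image row, roughly 17 pixels at the 7 mm GoPro GSD; because the displacement varies with row position, it cannot be absorbed by a global lens-distortion model.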
Results: Point Cloud Generation by Dense Matching at Site 1

Software package | Average point spacing [m] | Average density [points/m^2] | 3D RMSE [m]
A                | 0.03                      | 846                          | 0.09
B                | 0.01                      | 9,999                        | 0.13
C                | 0.03                      | 593                          | 0.52
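As a consistency check, a uniform point distribution ties the two columns together via density ≈ 1/d^2, where d is the average point spacing. Package B matches this exactly (1/0.01^2 = 10,000 points/m^2), while A and C fall well below the 1/0.03^2 ≈ 1,111 points/m^2 bound, which is consistent with less uniform point distributions (e.g., gaps in poorly textured areas).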
Results: Point Cloud Generation from LiDAR Data at Site 4

Point cloud density [points/m^2]: Nikon 12,429; GoPro 7,175; Velodyne 867

[Figures: point cloud colored by intensity (0-160) and by elevation (283.0-298.0 m); 10 m scale bar]
Ongoing work; georeferencing and calibration issues are not yet resolved
Cross Comparison and Status

Point cloud generation performance w.r.t. georeferencing accuracy and image sensor quality:

Georeferencing             | GCP | GoPro     | Nikon     | Velodyne                           | Density
GPS/MEMS IMU               | Yes | Good 1)   | Good 1)   | Good 1)                            | High
GPS/MEMS IMU               | No  | Fair 2)   | Fair 2)   | Good 1)                            | High
Carrier-based GPS/MEMS IMU | Yes | Excellent | Excellent | Excellent                          | High
Carrier-based GPS/MEMS IMU | No  | Fair 3)   | Fair 3)   | Excellent                          | High
N/A                        | No  | N/A       | N/A       | Good/Excellent (not yet validated) | Medium

1) Point cloud accuracy limited by imaging sensor performance
2) Poor absolute accuracy and fair relative accuracy
3) Poor absolute accuracy and good relative accuracy
UAS Point Cloud Generation in Non-Topographic Applications
- UAS images: GSD 5 mm (on wires), 10 pix across the wire
- Flying height: 55 m above the wires
- Very high resolution
- Accurate georeferencing
- Features on wires
- Dense matching point cloud (similar in character to LiDAR)
3D Geometry Extraction: Workflow

1. Data acquisition: hi-res images, high overlap, GCPs and/or high-quality navigation data
2. Image block adjustment: AT, bundle adjustment
3. Image dense matching (point cloud): variety of methods, e.g., SGM; a few software packages tested
4. Point cloud filtering: unwanted points removal
5. Point cloud segmentation / wire detection: groups of points belonging to a single wire and sag; semi-manual in this investigation
6. 3D geometry extraction (catenary fitting): horizontally a line, vertically a catenary, robust LS (see the sketch below)
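A minimal sketch of step 6 under our own parameterization (the slides specify only line + catenary + robust LS; function and variable names are ours):

    import numpy as np
    from scipy.optimize import least_squares

    def fit_catenary(s, z):
        # s: horizontal distances along the fitted wire line [m]
        # z: elevations of the segmented wire points [m]
        # Model: z = c + a * (cosh((s - s0) / a) - 1), lowest point at (s0, c)
        def residuals(p):
            a, s0, c = p
            return c + a * (np.cosh((s - s0) / a) - 1.0) - z

        p0 = [100.0, s.mean(), z.min()]  # crude initial guess
        # Robust loss down-weights leftover dense-matching noise.
        fit = least_squares(residuals, p0, loss="soft_l1", f_scale=0.1)
        return fit.x  # (a, s0, c)

    # Synthetic usage on one wire span (placeholder numbers):
    s = np.linspace(0.0, 80.0, 400)
    z = 10.0 + 150.0 * (np.cosh((s - 40.0) / 150.0) - 1.0)
    z += np.random.normal(scale=0.05, size=s.size)
    print(fit_catenary(s, z))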
Test Object
- Single section of a transmission line between two poles
- Two levels of wires: 2 wires at the upper level (U), 3 wires at the lower level (L)
- A different azimuth is expected for the U2 wire due to a different mount
Test Data Acquisition

Image parameters | Ground | Wire
GSD [mm]         | 8      | 5-6
Endlap [%]       | 87     | 81-84
Sidelap [%]      | 69     | 55-61

Camera: Nikon D800 (full-frame 36 Mpix DSLR), Nikon Nikkor AF-S 50 mm f/1.4G lens, 1 FPS
Ground control: 10 GCPs surveyed with GPS-RTK, 3D RMSE 1.5 cm
Point Cloud Filtering and Segmentation

Wire | Number of points
L1   | 46,021
L2   | 25,423
L3   | 35,247
U1   | 17,415
U2   | 3,259
Results: Fitted Catenaries
Catenary fitting 3D RMSE = 10 cm
Conclusions
- Point clouds can be easily generated from UAS imagery; they are less dependent on (or independent of) the accuracy of the platform's direct georeferencing
- Point cloud quality and accuracy depend strongly on the camera's geometrical properties and image scale (cm- to dm-level accuracy), as well as on the matching method/implementation used
- UAS point clouds have potential in non-topographic applications
- The typically used single-frequency GPS receivers (single-point solution) are not suitable for UAS georeferencing for precise mapping purposes (m-level accuracy)
- Miniaturized dual-frequency GPS receivers (differential solution) are fit for direct georeferencing of UAS (cm-level accuracy)
- Using ground control and/or accurate air control provides the highest orientation accuracy for all sensors
- More investigation of MEMS IMUs on UAS is needed (advancing technology)
- More work on the high-accuracy GPS/MEMS IMU solution is required to assess LiDAR point cloud performance and compare it to image-derived point clouds
THANK YOU!