Annotation-Based Rescue Assistance System for Teleoperated Unmanned Helicopter with Wearable Augmented Reality Environment

Masanao KOEDA, Yoshio MATSUMOTO, Tsukasa OGASAWARA
Nara Institute of Science and Technology, Graduate School of Information Science, Robotics Lab.
8916-5 Takayama, Ikoma, Nara, 630-0192 Japan
Email: {masana-k, yoshio, ogasawar}@is.naist.jp

Abstract

In this paper, we introduce an annotation-based rescue assistance system for a teleoperated unmanned helicopter with a wearable augmented reality (AR) environment. In this system, an operator controls the helicopter remotely while watching an annotated view from the helicopter through a head-mounted display (HMD), with a laptop PC carried in a backpack. Virtual buildings and textual annotations assist the rescue operation by indicating the positions to be searched rapidly and intensively. The position and attitude of the helicopter are measured by a GPS receiver and a gyroscope and sent to the operator's PC via a wireless LAN. Using this system, we conducted experiments to find persons and verified the feasibility of the system.

Fig. 1. Teleoperated Unmanned Helicopter

I. Introduction

Unmanned helicopters are currently used for various purposes, such as crop-dusting and remote sensing.

Fig. 2. Operator Wearing Wearable Augmented Reality Environment

Fig. 3. System Diagram (helicopter: omnidirectional camera and DV converter (NTSC/IEEE 1394), gyroscope, GPS (RS232C via USB-serial converter), note PC, and PCMCIA wireless LAN; data relay station: wireless LAN access point with external antenna; operator: note PC with wireless LAN, HMD, and USB head tracker)

Fig. 4. System Overview (GPS, omnidirectional camera, gyroscope, note PC, and wireless LAN mounted on the helicopter)

Fig. 5. Data Relay Station

However, it is difficult for an operator to control an unmanned helicopter remotely. One reason is that the operator cannot be aware of its attitude when he/she is far away from the helicopter. Another reason is that the coordinate system between the helicopter and the operator changes drastically depending on the attitude of the helicopter. To solve these problems, several studies have been made on autonomous helicopters [1]-[5]. However, autonomous helicopters need pre-determined landmarks or flight paths in order to fly, so they are not suitable for flight tasks where the situation changes every minute, such as disaster relief. Additionally, many on-board sensors and computers are needed for control. Since the payload of a helicopter is severely limited, autonomous helicopters tend to be large, heavy, and expensive.

We proposed an immersive teleoperating system for unmanned helicopters using an omnidirectional camera (Figure 1) [6]. In this system, the operator controls the helicopter remotely by viewing the surroundings of the helicopter through an HMD (Figure 2). The advantage of this system is that only a camera and a transmitter need to be installed on the helicopter. Therefore, it is possible to use a compact helicopter with a small payload, keeping the system lightweight and cheap.

Fig. 6. Wearable Augmented Reality Environment (head-mounted display, gyroscope, controller, and backpack with a laptop PC and wireless LAN)

Additionally, it becomes easy to control the unmanned helicopter because the coordinate system between the helicopter and the operator does not change even when the attitude of the helicopter changes. Furthermore, the operator can retain control even when the helicopter is out of sight, as long as the video image reaches the operator and the helicopter receives the control signals. However, in this system, control becomes impossible when the image transmission fails or visibility is poor. In addition, the resolution of the perspective image generated from the omnidirectional image is low, and the operator has trouble seeing distant objects. To solve these problems, we are developing an annotation-based assistance system for an unmanned helicopter.
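As background for how the operator's view is produced, the following is a minimal sketch, not the authors' implementation, of re-sampling a perspective view from an omnidirectional image. It assumes a simplified equidistant mirror model; the actual ACCOWLE sensor uses a hyperboloidal mirror whose projection depends on calibrated mirror parameters, and the image sizes, field of view, function names, and sign conventions below are illustrative assumptions.

# Sketch (not the authors' code): re-sample a perspective view from an
# omnidirectional image, using a simplified equidistant model in which the
# angle from the mirror axis is proportional to the image radius. The real
# sensor uses a hyperboloidal mirror and would need its calibrated mirror
# parameters instead; sizes, field of view, and signs are illustrative.
import math
import numpy as np

def perspective_from_omni(omni, head_yaw, head_pitch,
                          out_w=320, out_h=240, fov_deg=60.0,
                          max_angle_deg=115.0):
    """omni: H x W x 3 array centered on the mirror axis. Returns the view."""
    h, w = omni.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    r_max = min(cx, cy)                      # radius that maps to max_angle_deg
    f = (out_w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    out = np.zeros((out_h, out_w, 3), dtype=omni.dtype)
    for v in range(out_h):
        for u in range(out_w):
            # Viewing ray in the head frame: x forward, y right, z up.
            x, y, z = f, u - out_w / 2.0, out_h / 2.0 - v
            # Rotate the ray by the head pitch (about y), then yaw (about z).
            cp, sp = math.cos(head_pitch), math.sin(head_pitch)
            x, z = cp * x - sp * z, sp * x + cp * z
            cyw, syw = math.cos(head_yaw), math.sin(head_yaw)
            x, y = cyw * x - syw * y, syw * x + cyw * y
            # Direction relative to the upward-looking mirror axis.
            azimuth = math.atan2(y, x)
            off_axis = math.atan2(math.hypot(x, y), z)   # 0 = along the axis
            r = r_max * math.degrees(off_axis) / max_angle_deg
            sx = int(cx + r * math.cos(azimuth))
            sy = int(cy + r * math.sin(azimuth))
            if 0 <= sx < w and 0 <= sy < h:
                out[v, u] = omni[sy, sx]
    return out

# Example: a 30-degree head turn to the right with a synthetic input image.
view = perspective_from_omni(np.zeros((492, 512, 3), dtype=np.uint8),
                             head_yaw=math.radians(30.0), head_pitch=0.0)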

II. Annotation-Based Assistance System for Unmanned Helicopter

We developed an annotation-based assistance system for an unmanned helicopter.

Figure 3 and Table I show the configuration and the specification of our system. On the helicopter, an omnidirectional camera and a gyroscope are mounted at the front, and a PC with a wireless LAN and a GPS receiver is hung at the bottom (Figure 4). Position/attitude data and the omnidirectional image are sent to the operator during the flight via a wireless LAN through a data relay station (Figure 5). On the ground, a perspective image is generated from the received image and displayed on the HMD that the operator wears. The displayed image changes depending on the head direction, which is measured by a gyroscope attached to the HMD. The operator has an annotation database that consists of the names and position information of neighboring real objects. Using the database and the current position and attitude of the helicopter, annotations are overlaid on the perspective image that the operator is observing, as sketched below. The direction of the nose of the helicopter, the ground speed, a map, and the operator's head attitude are also displayed on the image. The operator holds a controller and controls the helicopter while wearing a backpack containing a laptop PC (Figure 6).
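As a concrete illustration of this overlay step, here is a minimal sketch (not the authors' implementation) of projecting an annotation's position into the operator's view. It assumes a local NED world frame, ZYX (yaw-pitch-roll) Euler angles for both the helicopter and the operator's head, and a simple pinhole model for the generated perspective image; all positions, angles, parameter values, and names are hypothetical.

# Sketch: overlay textual annotations on the perspective view.
# Assumptions (not from the paper): local NED world coordinates, ZYX Euler
# angles (yaw, pitch, roll), pinhole projection for the perspective image.
import math

# Hypothetical annotation database: name and local NED position [m].
ANNOTATIONS = [
    ("Evacuation Area", (120.0, 45.0, 0.0)),
    ("Fire Department", (-80.0, 210.0, 0.0)),
]

def world_to_frame(v, yaw, pitch, roll):
    """Express a world-frame vector in a frame rotated by ZYX Euler angles."""
    x, y, z = v
    c, s = math.cos(yaw), math.sin(yaw)        # undo yaw (about z)
    x, y = c * x + s * y, -s * x + c * y
    c, s = math.cos(pitch), math.sin(pitch)    # undo pitch (about y)
    x, z = c * x - s * z, s * x + c * z
    c, s = math.cos(roll), math.sin(roll)      # undo roll (about x)
    y, z = c * y + s * z, -s * y + c * z
    return (x, y, z)

def project_annotation(ann_pos, heli_pos, heli_ypr, head_ypr,
                       f=400.0, cx=256.0, cy=246.0):
    """Return the pixel (u, v) of an annotation, or None if it is behind the viewer."""
    rel = tuple(a - h for a, h in zip(ann_pos, heli_pos))
    rel = world_to_frame(rel, *heli_ypr)          # world -> helicopter body frame
    xv, yv, zv = world_to_frame(rel, *head_ypr)   # body -> operator view frame
    if xv <= 1.0:                                 # behind or too close to the viewer
        return None
    u = cx + f * yv / xv                          # view frame: x forward, y right, z down
    v = cy + f * zv / xv
    return (u, v)

# Example with hypothetical pose estimates from the GPS and gyroscopes.
heli_pos = (30.0, 60.0, -25.0)                    # NED position [m], 25 m altitude
heli_ypr = (0.0, 0.05, -0.02)                     # yaw, pitch, roll [rad]
head_ypr = (math.radians(-10.0), 0.0, 0.0)
for name, pos in ANNOTATIONS:
    uv = project_annotation(pos, heli_pos, heli_ypr, head_ypr)
    if uv is not None:
        print("draw '%s' at pixel (%.0f, %.0f)" % (name, uv[0], uv[1]))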

III. Experiment

Using this system, we carried out an experiment to assist the search for persons in the image captured by the camera mounted on the helicopter. The experiment was conducted at the HEIJYO Palace Site in Nara prefecture. Figure 7 is an overview of the experimental environment. The helicopter took off from position A in Figure 8 and flew around it for a few minutes. Figure 8 also shows the positions of the textual annotations, which consist of "Fire Department", "Medical Center", "City Office", "Elementary School", "Police Box", "Library", and "Evacuation Area". Virtual buildings are overlaid on the captured image to restore the original state of the city destroyed by the disaster. Additionally, a map of the neighborhood is displayed at the lower left of the image to show the position of the helicopter. One plausible representation of the annotation database is sketched below.
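The annotation names below are those used in the experiment; the coordinate values, reference point, and helper function are hypothetical, assuming each annotation is stored as a name with a latitude/longitude pair that is converted to local east/north metres with an equirectangular approximation around the take-off point.

# Sketch of an annotation database for the experiment. The names match the
# annotations listed in the text; the latitude/longitude values are
# placeholders, not the actual surveyed positions.
import math

ANNOTATION_DB = [
    ("Fire Department",   34.6915, 135.7930),
    ("Medical Center",    34.6918, 135.7945),
    ("City Office",       34.6922, 135.7952),
    ("Elementary School", 34.6925, 135.7938),
    ("Police Box",        34.6912, 135.7948),
    ("Library",           34.6909, 135.7935),
    ("Evacuation Area",   34.6920, 135.7960),
]

EARTH_RADIUS_M = 6378137.0

def to_local_en(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Latitude/longitude -> east/north metres from a reference point
    (equirectangular approximation, adequate over a few hundred metres)."""
    dlat = math.radians(lat_deg - ref_lat_deg)
    dlon = math.radians(lon_deg - ref_lon_deg)
    east = EARTH_RADIUS_M * dlon * math.cos(math.radians(ref_lat_deg))
    north = EARTH_RADIUS_M * dlat
    return east, north

# Example: express every annotation relative to the take-off point A
# (the reference coordinates are hypothetical).
ref_lat, ref_lon = 34.6916, 135.7940
for name, lat, lon in ANNOTATION_DB:
    e, n = to_local_en(lat, lon, ref_lat, ref_lon)
    print(f"{name:18s} east={e:7.1f} m  north={n:7.1f} m")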

TABLE I. System Specification

Helicopter
  Airframe:      JR Voyager GSR, payload 4.0 [kg]
  PC:            SONY PCG-U1 (CPU: Crusoe 867 [MHz], memory: 256 [MB], wireless LAN: WLI-USB-KS11G)
  Camera:        ACCOWLE Omnidirectional Vision Sensor (with hyperboloidal mirror), resolution 512x492 [pixel]
  Capture:       Canopus ADVC-55
  GPS:           eTrex Vista
  Gyroscope:     InterSense InertiaCube2

Data Relay Station
  Access Point:  BUFFALO WHR2-A54F54
  Antenna:       BUFFALO WLE-HG-NDC

Operator
  PC:            TOSHIBA Dynabook G8 (CPU: Pentium 4 2.0 [GHz], memory: 768 [MB], wireless LAN: embedded)
  HMD:           i-O Display Systems i-glasses! LC, resolution 450x266 [pixel]
  Gyroscope:     InterSense InterTrax2

Figure 9 shows the position and attitude data acquired from the GPS and the gyroscope mounted on the helicopter; these data were transmitted to the operator through the wireless LAN during the flight. Figure 10 shows a sequence of snapshots viewed by the operator. Around the textual annotation "Evacuation Area", a person could be found at the points marked with blue circles in Figure 10-(d),(e).
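The paper does not describe the telemetry format, so the following is only a hypothetical sketch of how one position/attitude sample from the GPS and gyroscope could be packed into a datagram and sent to the operator's PC over the wireless LAN; the field layout, address, and port are assumptions.

# Hypothetical telemetry sketch (the actual protocol is not described in the
# paper): pack one position/attitude sample and send it to the operator's PC
# over the wireless LAN as a UDP datagram.
import socket
import struct
import time

POSE_FMT = "!dddfff"                      # time [s], lat [deg], lon [deg], yaw/pitch/roll [deg]
OPERATOR_ADDR = ("192.168.0.10", 5005)    # hypothetical address of the operator's note PC

def send_pose(sock, lat_deg, lon_deg, yaw, pitch, roll):
    packet = struct.pack(POSE_FMT, time.time(), lat_deg, lon_deg, yaw, pitch, roll)
    sock.sendto(packet, OPERATOR_ADDR)

def receive_pose(sock):
    data, _ = sock.recvfrom(struct.calcsize(POSE_FMT))
    t, lat, lon, yaw, pitch, roll = struct.unpack(POSE_FMT, data)
    return {"time": t, "lat": lat, "lon": lon, "yaw": yaw, "pitch": pitch, "roll": roll}

# On the helicopter-side note PC: send one sample (values are placeholders).
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(tx, 34.6914, 135.7971, 12.5, -1.3, 0.8)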

IV. Conclusion

In this paper, an annotation-based rescue assistance system for a teleoperated unmanned helicopter with a wearable augmented reality environment was proposed. We conducted an experiment to assist a search operation for persons in the image captured by the camera mounted on the helicopter. To support the operation, textual annotations and virtual buildings were overlaid on the captured image, and a moving person could be found in the search area.

Acknowledgement

This research is partly supported by the Core Research for Evolutional Science and Technology (CREST) Program "Advanced Media Technology for Everyday Living" of the Japan Science and Technology Agency (JST).

Fig. 7. HEIJYO Palace Site

Fig. 8. Location of Textual Annotations (take-off point A and the annotations "City Office", "Police Box", "Elementary School", "Medical Center", "Fire Department", "Evacuation Area", and "Library")

Fig. 9. Position and Attitude of Unmanned Helicopter during Experimental Flight: (a) position (latitude vs. longitude from the GPS), (b) attitude (yaw, roll, and pitch angles [deg] vs. time [s] from the gyroscope)

References

[1] Ryan Miller, Omead Amidi, and Mark Delouis, "Arctic Test Flights of the CMU Autonomous Helicopter", Proceedings of the Association for Unmanned Vehicle Systems International, 1999.
[2] Kale Harbick, James F. Montgomery, and Gaurav S. Sukhatme, "Planar Spline Trajectory Following for an Autonomous Helicopter", Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 8, No. 3, pp. 237-242, 2004.
[3] Daigo Fujiwara, Jinok Shin, Kensaku Hazawa, Kazuhiro Igarashi, Dilshan Fernando, and Kenzo Nonami, "Autonomous Flight Control of Small Hobby-Class Unmanned Helicopter, Report 1: Hardware Development and Verification Experiments of Autonomous Flight Control System", Journal of Robotics and Mechatronics (Japan Society of Mechanical Engineers, Robotics and Mechatronics Division), Vol. 15, No. 5, pp. 537-545, 2003.
[4] Kensaku Hazawa, Jinok Shin, Daigo Fujiwara, Kazuhiro Igarashi, Dilshan Fernando, and Kenzo Nonami, "Autonomous Flight Control of Small Hobby-Class Unmanned Helicopter, Report 2: Modeling Based on Experimental Identification and Autonomous Flight Control Experiments", Journal of Robotics and Mechatronics (Japan Society of Mechanical Engineers, Robotics and Mechatronics Division), Vol. 15, No. 5, pp. 546-554, 2003.
[5] Hiroaki Nakanishi and Koichi Inoue, "Development of Autonomous Flight Control System for Unmanned Helicopter by Use of Neural Network", Proceedings of the World Congress on Computational Intelligence 2002, pp. 2400-2405.
[6] Masanao Koeda, Yoshio Matsumoto, and Tsukasa Ogasawara, "Development of an Immersive Teleoperating System for Unmanned Helicopter", Proceedings of the IAPR Workshop on Machine Vision Applications 2002, pp. 220-223.

Fig. 10. Snapshots (a)-(f) of the generated image with overlaid annotations; a person was found around the annotation "Evacuation Area"
