University of Innsbruck

Institute of Computer Science Intelligent and Interactive Systems

Implementation of a ROS-Based Robot-Toolkit and Robot-Demos

Daniel Eberharter

B.Sc. Thesis
Supervisor: Emre Ugur, Ph.D.
18th June 2015

Abstract

This thesis documents a set of tools implemented to improve the usability of the robot of the IIS research group. The tools use the Robot Operating System (ROS) for communication and are designed as plugins for the existing ROS toolkit rqt. The robot plugins developed in the course of this project enable easy control of the robot's arm, hand and head joints, plotting of the robot's status topics, as well as real-time visualization of the output of the robot's tactile sensors. Additionally, two graphical user interfaces for existing projects, namely object recognition and tower building, have been developed with the aim of introducing a decoupled and visually appealing front-end to these projects. The final part of this project was the implementation of a standalone demonstration that highlights the properties of a visual attention-mapping algorithm.


Acknowledgments

I want to thank all members of the IIS research group for assisting me in the development and testing of this project's components, especially my supervisor Emre Ugur for his constant feedback and guidance.


Contents

Abstract
Acknowledgments
Contents
List of Figures
List of Tables
Declaration

1 Introduction
1.1 Motivation
1.2 Goals
1.3 Related Work
1.4 The IIS Robot Setup

2 ROS - The Robot Operating System
2.1 Introduction
2.2 Core Features
2.3 Communication
2.3.1 Topics
2.3.2 Publishing
2.3.3 Latching
2.3.4 Subscribing
2.4 Catkin

3 Qt
3.1 Introduction
3.2 Class Structure
3.3 Implementation
3.4 Event Handling

4 rqt
4.1 Introduction
4.2 Creating a Plugin
4.3 Building a Plugin

5 Robot-Control GUI
5.1 Introduction
5.2 The Components
5.3 The Container
5.4 Communication - The RobotController
5.5 Headcontrol
5.5.1 Implementation
5.6 Handcontrol
5.6.1 Implementation
5.7 Armcontrol
5.7.1 Implementation
5.8 Controlling of Arms and Head using a Gamepad

6 Visualization of Robot Data
6.1 Introduction
6.2 Data-Format
6.3 Data-Parsing
6.4 Plotting
6.5 Implementation

7 Visualization of Data from Tactile Sensors
7.1 Introduction
7.2 Structure
7.3 Data-Format
7.4 Data-Parsing
7.5 Real-Time Data Input
7.6 Visualization

8 Object-Recognition Demonstration
8.1 Introduction
8.2 Communication
8.3 Object Visualization
8.4 Implementation
8.5 Resources
8.6 Configuration

9 Tower-Building Demonstration
9.1 Introduction
9.2 Communication
9.3 Displaying Kinect Depth-Maps
9.4 Highlighting of Objects in the Depth-Map
9.5 Visualization of Object-Traits
9.6 Implementation

10 Visual-Saliency Demonstration
10.1 Introduction
10.2 Color-Detection
10.3 Salient Attention-Mapping
10.3.1 Movement Detection
10.4 Head-Following
10.5 Implementation

11 Conclusion

Bibliography

List of Figures

1.1 The robot setup
2.1 ROS communication
2.2 Sequence-diagram of interleaved publishing
3.1 A small sample of the Qt class hierarchy
3.2 The designer view of QtCreator
5.1 UML RobotControl
5.2 UML RobotControl Structure
5.3 UML RobotCommunication
5.4 Screenshot of the HeadControl GUI
5.5 A model of the HeadControl class
5.6 Screenshot of the HandControl GUI
5.7 A model of the HandControl class
5.8 Screenshot of the ArmControl GUI
5.9 A model of the ArmControl class
5.10 A model of the GamePad class
6.1 Screenshot of the ArmPlotter GUI
6.2 UML ArmPlotter
7.1 Screenshot of the HandPlotter GUI
7.2 A UML diagram showing the structure of the HandPlotter plugin
7.3 HSL
8.1 A screenshot from the ObjectRecognition GUI on startup
8.2 A diagram that shows the data-flow between the relevant components
8.3 UML ObjectRecognition
9.1 A screenshot from the TowerBuilding GUI on startup
9.2 A UML diagram representing the structure of the BarGraph widget
9.3 A UML diagram that shows the structure of the ObjectStacking plugin
10.1 Examples for attention-mapping
10.2 A screenshot from the VisualSaliency GUI on startup
10.3 Example for color-detection
10.4 Example for saliency-mapping
10.5 UML VisualSaliency

List of Tables

5.1 Topics for RobotControl
5.2 Descriptions of the used topics for controlling the robot
6.1 The data-format of the ROS-topic-plotter tool
7.1 The message-type of the data from the tactile-sensors
8.2 The format of the demo-communication-frame
9.1 Topics for TowerBuilding correspondence

Declaration

By my own signature I declare that I produced this work as the sole author, working independently, and that I did not use any sources or aids other than those referenced in the text. All passages borrowed from external sources, verbatim or in content, are explicitly identified as such.

Signed: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Date: . . . . . . . . . . . . . . . . . . . . . . .


Chapter 1

Introduction

A decade ago the use of robots in everyday life was hardly imaginable and was associated only with science-fiction movies. Nowadays it is normal that robots vacuum-clean houses and cars park autonomously, and ambitions grow each year, driving expectations for research in this area to great heights. How much scientific work stands behind an innovation like a cleaning robot goes mostly unnoticed by the public; it is therefore important to create ways for researchers to introduce their topic in an easily understandable manner.

The term demonstration has a very broad interpretation and can mean anything from advertising the status of research in a practical way to a fully interactive user experience. Several demonstrations were implemented in the course of this project, and they aim to fulfill two main goals. The first objective, as with every demo, is to present the processes of an application in an easy-to-understand and visually appealing manner. The second goal is to decouple the presentation layer of projects from their application layer, which has the big benefit of making them more cohesive and increasing their maintainability.

The first demonstration is based on the object-recognition procedures developed by Özgür Erkent, Ph.D., in the course of the EU project 3rd-hand robot (http://3rdhandrobot.eu/). The graphical user interface (GUI) visualizes recognized components in a scene and displays object traits.

The second demonstration allows the user to interactively define the parameters for the tower-building demo developed by Emre Ugur, Ph.D., and displays the data calculated by the demo back-end. The GUI lets the user select the type of tower the robot should build and handles the necessary communication with the demonstration. The demonstration's back-end recognizes traits of objects in a scene using input from Kinect cameras and sends the associated data to the GUI, which visualizes these results accordingly.

The last demonstration is a standalone application with the goal of displaying the difference between a visual attention-mapping algorithm (Laurent Itti, 1998) and an arbitrary color-detection implementation. The demonstration uses the visual data provided by the camera on the IIS robot's head and calculates the center of attention in the scene, mimicking human behaviour. An optional head-following mode has been implemented, in which the robot incrementally aligns its head (and therefore the camera) to bring the object into the center of view. The algorithms used for attention-mapping and color-detection have been extended in order to detect movement in a scene. Since temporal behaviour strongly influences the attention-evaluation of objects in the human brain, this is an important trait of a saliency algorithm (Li and Gao, 2014).


The second part of the project focuses on applications that ease the use of the robot and make it possible to visualize important status information. The first of these tools is the robot-control plugin, which allows the user to move the joints of the robot's hardware, trigger gripping, and observe relevant parameters such as temperature changes in the system. Additionally, support for controlling the robot with a gamepad or joystick has been integrated. The other robot-based tools focus on the visualization of recorded robot data. The first of these tools enables the user to plot graphs of multiple status values of the robot (such as temperature or torque) and compare them to each other. The second plugin graphically displays the pressure applied to the tactile sensors on the robot's hands. While both of these tools can parse their input data from files, the latter can also visualize real-time behaviour through a ROS subscription.

1.1 Motivation

Since this project consists of two major parts, the robot tools and the robot demos, each part has a very different motivation.

The goal of the object-recognition and tower-building demo plugins is to create a decoupled front-end for existing demos, designed so that the functionality of the demonstration's GUI is completely independent of the back-end implementation. The visual-attention demo shows the capabilities of attention-mapping algorithms and gives the user the opportunity to compare such an algorithm to a simple color-tracking algorithm. The attention-mapping implementation used does not take movement into account; it was therefore another contribution of this project to implement simple movement detection instead of using an existing algorithm.

The robot tools serve the purpose of easing everyday work with the robot and assist in tasks like arm positioning, head movement, gripping and status observation. The main development focus has been on usability, in order to make the usage of the software as intuitive as possible. This is necessary to convince people to abandon familiar usage patterns and start using these new tools.

1.2 Goals

The goal of the project was the implementation of the following plugins:

• Controlling the robot's arms, hands and head (chapter 5)
• Visualizing information received from the robot (chapter 6)
• Visualizing data from the robot's tactile sensors in real-time (chapter 7)
• Implementing a control interface for an object-recognition demonstration (chapter 8)
• Implementing a control interface for a tower-building demonstration (chapter 9)
• Implementing a module for visual attention-mapping (chapter 10)

All of these goals have been realized as standalone projects with no dependencies on each other.


1.3 Related Work

The big difference between generalized ROS tools and the plugins developed in this project is that existing tools aim to be as general as possible, while the plugins developed for the IIS research group are highly specialized and contain no overhead that might distract the user from the main features. This specialization results in an overall easier-to-use plugin collection that requires little setup. The developed demonstrations and GUIs are unique in their use-case and cannot be compared to any conventionally used software.

1.4 The IIS Robot Setup

The robot referenced throughout the thesis is shown in figure 1.1 and is located at the University of Innsbruck.

Figure 1.1: The IIS robot setup at the University of Innsbruck (source: Grieser, 2014).

The robot's main components are the two Kuka LWR4+ arms (each with seven degrees of freedom), two Schunk SDH grippers (with seven degrees of freedom) and the Karlsruhe Humanoid Head (Tamim Asfour, 2008), which has seven degrees of freedom and offers data from four front-facing cameras. Additionally the robot has a Kinect camera mounted at its torso, which is used by the ObjectRecognition demonstration developed in this project (see chapter 8).


Chapter 2

ROS - The Robot Operating System

2.1 Introduction

The Robot Operating System (ROS) is a library collection that enables simple hardware communication over the network. It originated at the Stanford Artificial Intelligence Laboratory and is today published as an open-source project under the BSD license (http://www.linfo.org/bsdlicense.html). The basic installation of ROS comes with a tool called rqt, which is a collection of useful plugins for working with ROS (see chapter 4). All tools and demos developed in the course of this project are implemented as plugins for rqt. This chapter introduces the main features of ROS, how communication between entities works with ROS, and how to build packages that support this communication.

2.2 Core Features

The main features of ROS are:

1. Network Communication: ROS allows communication between clients in a network via data-writing (publishing) and data-reading (subscribing) to so-called topics. Topics are essentially channel IDs used for identifying the communication's content (see section 2.3).

2. Hardware Abstraction: The functionality of ROS is completely decoupled from the hardware of the underlying node. This makes it possible to exchange physical components without changing the implementation above them.

3. Package Management: ROS provides built-in functionality for maintaining the state of the installed components. All installed ROS components are kept up-to-date through ROS' built-in package manager, which also enables intuitive installation of additional functionality.

4. Wide Language and Environment Support: Libraries are currently provided for integrating ROS communication into code of various languages (e.g. C/C++, Python) and different environments (e.g. Matlab).


2.3 Communication

The communication offered by ROS is based on publishing and subscribing to topics, where a topic is simply a named data container that stores only the incoming data (see section 2.3.3) and pushes status updates to all subscribing nodes. A node is a client in a ROS network and can be a subscriber, a publisher or both. Each node is identified by a node name, which has to be unique in the network. The information exchange within a ROS network can be 1:1, 1:n, n:1 or n:m.

Figure 2.1: An example of ROS-based inter-device communication (source: http://robohub.org/wp-content/uploads/2014/01/ros101-3.png).

Figure 2.1 shows an example setup in which a desktop PC is connected to the robot and serves as the ROS master, meaning that all nodes in the ROS network have to register at that node to be able to send and receive data. This setup features a client (Laptop) that registers at the master and subscribes to the topic image_data. Once the Camera Node receives data from the camera and publishes it to that topic, the client (Laptop) receives the image and can process it further. This procedure works analogously for multiple receiving nodes.

2.3.1 Topics

Topics are the central posts for data transmission. Each publication of data has to be directed to exactly one topic, which means that topic names have to be locally unique. In the case of the IIS setup, robot topics are named according to the following pattern:

execution_type/hardware/control_type/control_topic
execution_type ∈ { simulation, real }
hardware ∈ { head, left_arm, right_arm, left_sdh, right_sdh }
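A full topic name following this pattern could, for instance, look like real/left_arm/joint_control/move, where joint_control/move stands in for an assumed control_type/control_topic pair; the concrete values are illustrative, not taken from the thesis.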


Additionally, all camera components start with the prefix "camera/...". Each topic is associated with a specific message type, which defines the format of the transported data. There are several basic message types (like integer, float, string, etc.), but the user may create custom message types composed of basic ones. The one constraint when using custom message types is that both the sending and the receiving end of the communication must have local knowledge of the structure of that type.
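For illustration, a custom message type is defined in a .msg file that composes basic types field by field; the name and fields below are assumptions, not a message used in this project:

# JointCommand.msg (hypothetical)
string    hardware        # e.g. "left_arm"
float64[] joint_values    # one target value per joint
bool      latched         # whether the message should be latched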

2.3.2 Publishing

Publishing data to topics is the main way of communication used in this project, and is most basically achieved by the following lines of code. (Since the project was implemented in C++, all code examples given in this thesis are written in C++.)

Listing 2.1: SimplePublishing

#include <ros/ros.h>
[...]
ros::NodeHandle n;
ros::Publisher pub = n.advertise<T>(_topic, _buf_size, _enable_latching);

pub.publish(_data);
ros::spinOnce();
[...]

This code publishes _data, with data type T, to _topic. The variable _enable_latching controls data latching (see section 2.3.3). Normally this publishing is done within a loop, since data has to be published multiple times in order to guarantee the transmission by ROS. This method is problematic when sending joint data to the robot, because once the published data changes, it is possible that old and new data will be received interleaved. Old and new joint values that are received by the robot in an interleaved way, as shown in figure 2.2, will create joint stuttering. This term describes the rapid switching of the robot's joints between the old and the new values. Depending on the value difference between the interleaved datasets this can lead to dangerous behavior of the robot's components.
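Such a publishing loop is left implicit above; a minimal sketch, assuming a fixed 10 Hz rate, might look like this:

ros::Rate rate(10.0);       // publishing frequency (assumed)
while (ros::ok()) {
    pub.publish(_data);     // re-send to guarantee transmission
    ros::spinOnce();
    rate.sleep();
}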

Figure 2.2: Sequence-diagram of interleaved publishing.


Figure 2.2 shows how this interleaved publishing behavior can be produced. Once a joint value of a robot component is changed by the user, it is published for a certain time duration to ensure that the message is sent correctly. If the joint value changes during this publishing time-frame, another publishing instruction is issued. This leads to a scenario where two publishers write their data at the same time and therefore constantly produce new movement instructions for the subscribing movement controller of the robot.

In the case of the robot-control plugin it is not necessary to ensure the correct transmission of each joint value; it is sufficient that the robot receives some joint updates. The following code example introduces a queued publisher thread, where each joint value is sent to the robot exactly once. With this adaption the joint stuttering has been completely eliminated.

Listing 2.2: PublisherThread

template <class T>
class PublisherThread : public QThread {
private:
    std::string _topic;     // the topic to publish to
    std::queue<T*> _msg_q;  // queue of message data
    double _frq;            // publishing frequency
    bool _finish;           // flag for killing the publisher-thread
    int _size;              // size of the message-buffer
    [...]

public:
    void publish(T *msg) {
        _msg_q.push(msg);
    }

    void run() {
        ros::NodeHandle n;
        ros::Publisher chatter_publisher = n.advertise<T>(_topic, _size, true);

        while (!_finish) {
            if (!_msg_q.empty()) {
                chatter_publisher.publish(*(_msg_q.front()));
                delete _msg_q.front();
                _msg_q.pop();
            }

            ros::spinOnce();
            usleep(1.0 / _frq * 1000000.0);
        }

        chatter_publisher.shutdown();
        n.shutdown();
    }

    [...]
};


This publisher is implemented as a QThread, which means that the publishing frequency can be adjusted through the sleep call without delaying the GUI. Once the publisher is started it constantly publishes the data in the message queue (_msg_q) until the queue is empty; each data segment is published only once. This behavior is necessary to avoid stuttering of robot components when the data that has to be published changes quickly (e.g. through fast slider movement in the robot-control GUI). This basic implementation of a PublisherThread is used throughout the whole project.
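A usage sketch follows; the constructor arguments are assumptions, since the constructor is elided in the listing above:

#include <std_msgs/Float64.h>

// Hypothetical construction; constructor arguments are assumed.
PublisherThread<std_msgs::Float64> *pub =
    new PublisherThread<std_msgs::Float64>(/* topic, buffer size, frequency */);
pub->start();       // QThread::start() invokes run() in its own thread

std_msgs::Float64 *msg = new std_msgs::Float64();
msg->data = 0.5;    // e.g. a target joint value
pub->publish(msg);  // queued; published exactly once, then deleted by run()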

2.3.3 Latching

Latching is a mechanism where a topic is forced to save the latest data received. Upon subscribing to a latched topic, the subscribing node immediately receives the most recent value, even if none has been published during the period of subscription (ROS, 2015b). This behavior is generally useful for asynchronous 1:n or n:m communication, which is generally not the case in this project; latching can also cause negative side-effects. An example of wrong behavior when using latching would be the communication between the front- and back-end of the tower-building demo (see section 9.2). The GUI sends a message containing "start" to the demo back-end when the user presses the start button. If that message were latched, the demo's back-end would receive "start" on every start-up (because a start-up implies subscribing to the GUI topic), which would be wrong. Data latching can be toggled by setting a flag when advertising a new publisher. Sending data using a non-latching publisher signals the receiving node not to latch the incoming data.

Listing 2.3: Advertising with the latching flag

ros::Publisher pub = n.advertise<T>(_topic, _size, _use_latching);

2.3.4 Subscribing

Subscribing to a topic is achieved through the use of a callback method. Each time a ROS topic changes its value, all subscribing nodes invoke the associated callback method, which receives the new data as its argument. This data-fetching procedure is called data-pulling. The following code contains a basic example of subscription:

Listing 2.4: SimpleSubscribing

void run() {
    ros::NodeHandle n;
    ros::Subscriber sub = n.subscribe(_topic, _buf_size,
                                      &Subscriber::callback, this);

    while (ros::ok()) {
        ros::spinOnce();
    }
}

void callback(const boost::shared_ptr<T> data) {
    [...]
}


It is notable that the main loop of the subscriber blocks the whole program until the subscriber is terminated. Since this behavior would render the GUI inoperable, the subscriber is implemented as a QThread and runs parallel to the main application. The callback's body usually processes the received data. Since this project's goal is to implement graphical components, this data has to be handed over to the Qt part of the application, which is normally achieved via QSignals. Unfortunately, the usage of QSignals within a class that extends QThread is currently not supported by Qt (with reference to Qt 4.0). To solve this issue a notifier class is necessary, which signals subscriber updates to other components.

Listing 2.5: Notifier

#include <QObject>

class Notifier : public QObject {
    Q_OBJECT
public:
    Notifier() {}

    void notify() {
        Q_EMIT notification();
    }

Q_SIGNALS:
    void notification();
};

Taking these adaptions into account, the structure of the final subscriber thread looks as follows:

Listing 2.6: SubscriberThread

template <class T>
class SubscriberThread : public RosThread {
private:
    std::string _topic;
    boost::shared_ptr<T> _data;
    bool _finish;
    Notifier *_notifier;

public:
    [...]

    void run() {
        ros::NodeHandle n;
        ros::Subscriber sub = n.subscribe(_topic, _buf_size,
                                          &SubscriberThread::callback, this);

        while (ros::ok() && !_finish) {
            ros::spinOnce();
            usleep(1000000);
        }
    }

    void callback(const boost::shared_ptr<T> data) {
        if (_finish || !data)
            return;

        _data = data;
        _notifier->notify();
    }

    Notifier *getNotifier() {
        return _notifier;
    }

    boost::shared_ptr<T> getData() {
        return _data;
    }
};

The main thread can simply use the notifier reference received through getNotifier() to bind the QSignal to a corresponding handler method, which can then access the actual data through getData().
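A wiring sketch for this binding could look as follows; the message type, constructor arguments and slot name are assumptions:

#include <sensor_msgs/JointState.h>

// Hypothetical construction; constructor arguments are assumed.
SubscriberThread<sensor_msgs::JointState> *sub =
    new SubscriberThread<sensor_msgs::JointState>(/* topic, buffer size */);

// Qt4-style connection: the Notifier's QSignal drives a GUI slot.
QObject::connect(sub->getNotifier(), SIGNAL(notification()),
                 this, SLOT(onJointStateChanged()));
sub->start();  // subscribes and spins in a separate thread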

2.4 Catkin

Catkin is the build system integrated in ROS. It allows library-decoupled building of projects by specifying dependencies and build settings in a configuration file called CMakeLists.txt (ROS, 2015a). Usually there is one CMakeLists file per project, but since the separate software parts developed in this project are independent of each other, each has its own CMakeLists file. The general structure of such a file looks as follows:

Listing 2.7: CMakeLists

project(myproject)

# dependencies go here
find_package(catkin REQUIRED COMPONENTS
  roscpp
  ...
)

# include library directories
include_directories(
  include
  ${catkin_INCLUDE_DIRS}
  $ENV{IIS_INCLUDE_PATH}/external_libs/SchunkLib/sdh
  $ENV{IIS_INCLUDE_PATH}/external_libs/ReflexxesTypeII/include
  $ENV{IIS_INCLUDE_PATH}/external_libs/SchunkLib/
  $ENV{IIS_INCLUDE_PATH}/external_libs/SchunkLib/demo/
)

# define package
catkin_package(
  INCLUDE_DIRS include
  LIBRARIES ${PROJECT_NAME}
  CATKIN_DEPENDS message_runtime
)

# tell cmake where header, source and ui files are
file(GLOB QT_FORMS RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} *.ui)
file(GLOB_RECURSE QT_MOC RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS *.hpp)
file(GLOB_RECURSE QT_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS *.cpp)
file(GLOB QT_RESOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} *.qrc)

# moc headers
QT4_ADD_RESOURCES(QT_RESOURCES_CPP ${QT_RESOURCES})
QT4_WRAP_UI(QT_FORMS_HPP ${QT_FORMS})
QT4_WRAP_CPP(QT_MOC_HPP ${QT_MOC})

# define target executable
add_executable(${PROJECT_NAME}
  ${QT_SOURCES}
  ${QT_FORMS_HPP}
  ${QT_RESOURCES_CPP}
  ${QT_MOC_HPP})

target_link_libraries(${PROJECT_NAME}
  ${catkin_LIBRARIES}
  ${QT_QTCORE_LIBRARY}
  ${QT_QTGUI_LIBRARY}
  ${PCL_INCLUDE_DIRS})


In addition to these compile-time dependencies it is necessary to define runtime dependencies for the project. This is done using the package.xml file.

Listing 2.8: package.xml
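A minimal sketch of a typical catkin package.xml, with the package name, maintainer and dependencies assumed rather than taken from the project, could look like this:

<?xml version="1.0"?>
<package>
  <name>myproject</name>
  <version>0.0.1</version>
  <description>Example rqt plugin package</description>
  <maintainer email="user@example.com">Daniel Eberharter</maintainer>
  <license>BSD</license>

  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>roscpp</build_depend>

  <run_depend>roscpp</run_depend>
  <run_depend>message_runtime</run_depend>
</package>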
