Final Report – 5HC99 Course 2013 Group 2

Ashish Bhatnagar, Matthias Schneider, PrithviRaj Narendra

Contents

Introduction
    Our Objectives
Hardware Design of the Quadcopter
    Cost list
    Hardware architecture of our system
Software architecture
    PID tuning
    Local control – OpenPilot
        Overview of OpenPilot
        Strategy for autonomous control of local control (OpenPilot) by Global Control
    Global control – Application using OpenCV on R-Pi
Problems faced and solutions
    Ultrasound sensor not reliable – investigation, solutions tried and finally use of laser pointer
    R-Pi camera drivers and frame rates for processing images
    Lack of documentation of OpenPilot Project
Project Status
Conclusions
Appendix A - Work division table


Introduction

This report is a concise documentation of the work done as part of the course Embedded Visual Control (5HC99) conducted at Eindhoven University of Technology. The course aimed at combining knowledge from the fields of Embedded Systems, Computer Vision, and Control Theory. The project involved building a quadcopter capable of carrying out some useful tasks. The project was subdivided into a set of subtasks that build on each other, resulting in a quadcopter that is able to navigate autonomously using cameras.

Our Objectives

Our objectives were divided into five assignments:

A. Building the Quadcopter
As the first task, we had to build a quadcopter. We had to identify, order, and finally assemble all the required components.

B. Tuning the PID Values for Local Control
This task involved identifying good local control parameters so that the quadcopter flies in a stable manner. All further tasks depended greatly on proper tuning of the PID values.

C. Hover and Autonomously Take a 360° Panoramic Image
This task involved flying the quadcopter autonomously at a certain pre-decided altitude. While making a 360° rotation, it was to take snapshots using the front camera, which were then to be stitched together into a panoramic image.

D. Follow Markers
This task was meant to enable the quadcopter to fly autonomously in a room. The quadcopter should be able to locate special markers pasted on the walls of the room and autonomously navigate from one marker to another.

E. Pick an Additional Task
As an additional task, we decided to use the markers from the previous task to find a landing site when the battery of the quadcopter drops below a certain level.


Hardware Design of the Quadcopter

Frame

The structural basis of the quadcopter is the frame, to which the motors and other components are mounted. We decided on a large styrofoam frame kit consisting of the styrofoam structure, fiberglass bars, and wooden motor bases holding the motors. This frame was a good choice for our project, mainly because of its low price (about 22 €) and the flexibility of mounting components on the styrofoam. Bumpers around the propellers protect them from damage in case of a collision. Figure 1 shows the frame, including the wooden motor bases and the fiberglass rods.


Figure 1: Quadcopter frame [1]


[1] Source: http://www.hobbyking.com/hobbyking/store/__25420__Extra_Large_EPP_Quadcopter_Frame_450mm_835mm_total_width_.html


Motors

The quadrotor's propellers are driven by four brushless electric motors. These brushless motors have permanent magnets attached to the outer casing, while the stator is inside and contains several electromagnets. These electromagnets are powered and controlled through an Electronic Speed Controller (ESC). We decided on the Turnigy DST-1200 Brushless Bell Motor 1200kv, which fits the motor bases of the frame well, is well dimensioned for the size of the quadrotor, and is reasonably priced (about 6,20 €). The motor is connected to its ESC using three cables, which supply power to the electromagnets according to the speed setting fed into the ESC. Figure 2 shows the motor.

Figure 2: Brushless DC motor [2]

[2] Source: https://www.hobbyking.com/hobbyking/store/uh_viewItem.asp?idProduct=26487

Electronic Speed Controllers (ESCs)

As discussed in the previous section, every motor is controlled by an Electronic Speed Controller (ESC). We chose the HobbyKing SS Series 18-20A ESC. The ESC has a power input and a PWM input used to control the motor speed. From these, the ESC generates the drive signal for the motor. Each ESC can supply a maximum current of 18 A to its motor. Furthermore, the ESC also generates a constant 5 V DC voltage, which can be used to power components like the main flight controller, the RC receiver, or the global control computer. This feature is called a BEC (Battery Eliminator Circuit) and supplies a maximum current of 2 A. Our main reasons for choosing this model of ESC were the price (about 6,10 €) and the compatibility with our motors. Figure 3 shows an ESC. On the right, we can see the cables for power and PWM signal input; on the left, the three output cables labeled A, B, and C.

Figure 3: Electronic Speed Controller [3]

[3] Source: http://www.hobbyking.com/hobbyking/store/__6457__hobbyking_ss_series_18_20a_esc.html

Later, we realized that this choice of ESC might not have been ideal. The firmware in these ESCs is often not tuned for use with quadrotors. With modified firmware, the motor control signals can often be driven at higher update rates than the original firmware allows. Also, the stock ESC firmware may preprocess (e.g. filter) the speed control signal rather than applying it immediately. This hurts the reaction speed, which is bad for the stability of the quadrotor. The issue is described online [4], and it is possible to download alternative firmware solving these issues. However, the motor controllers we used are protected against replacing the firmware, so we were not able to improve the stability by reflashing.

[4] http://semmel018.bplaced.net/filemanager/index.php/elektrik/7-esc-flashen-mit-simonk.html


Flight Controller Board (local control)

The software for local control runs on the flight controller board. This board has to include a processor to run the flight control software, inputs and outputs for RC controller signals and motor control signals, and a full set of 3D accelerometers and gyroscopes as well as a magnetometer. There are a number of open-source projects for flight controller software, and each project usually supports only a certain set of flight controller hardware platforms (see section "Local Control"). Essentially, this means that the choice of the flight controller board is also a choice of a specific open-source flight controller project. We chose the STM32F3DISCOVERY board by STMicroelectronics. This hardware includes all the inputs, outputs, and sensors required, and it is supported by the OpenPilot flight controller software. The price for a board is 13,45 €. It can be programmed with the software over a USB connection, which is very convenient. Figure 4 shows the STM32F3DISCOVERY board.

Figure 4: DiscoveryF3 board for local control [5]

[5] Source: http://www.st.com/web/en/catalog/tools/FM116/SC959/SS1532/PF254044




Remote Control hardware

The goal of this project is to develop an autonomous quadrotor that can fly and navigate self-sufficiently. During development, however, it is first necessary to control the quadrotor manually, using a remote control system. This remains an important safety feature, as it should always be possible to intervene manually if the automatic control fails. Therefore, it was necessary to use a remote control system consisting of a transmitter and a receiver. These systems are very common and heavily used in the field of hobby RC electronics. An RC system has a limited number of channels, and we had to make sure that the system we used carried enough channels to control a quadrotor. It is necessary to control pitch, roll, yaw, and throttle; therefore, at least a 4-channel RC system was required. We decided on the FlySky FS-CT6B transmitter and the corresponding receiver (FS-R6B). This is a basic, relatively inexpensive (38,93 €) 6-channel RC set perfectly suited to our needs. Figure 5 shows this system.

Figure 5: 6-channel remote control [6]

[6] Source: http://www.rcgroups.com/forums/showthread.php?t=1387731




Bluetooth

It is possible to connect the flight controller software to the Ground Control Station (GCS), a piece of software running on a normal PC that receives telemetry data from the quadrotor and visualizes it on screen. Furthermore, the GCS can be used to change configuration parameters like PID values (see section "PID Tuning"). It is possible to create a connection between the local control board and the GCS software via USB, but this is not very convenient, especially in flight. Additionally, the control board's position on the quadrotor frame should stay fixed after calibration: if a USB cable is unplugged after a calibration, the board may shift, potentially invalidating the calibration parameters and leading to a more unstable system. For these reasons, it was necessary to create a wireless connection. Bluetooth was the natural choice, since the software already contains all the necessary drivers. We chose the JY-MCU Bluetooth Wireless Serial Port Module, because it offers all the necessary functions at a low price (approx. 6,60 €). Figure 6 shows the Bluetooth module.

Figure 6: Bluetooth serial module [7]

[7] Source: http://dx.com/p/jy-mcu-arduino-bluetooth-wireless-serial-port-module-104299




Ultrasonic sensor

An important part of stabilizing the quadrotor is being able to hold an altitude, which requires measuring the current altitude. An ultrasonic sensor was intended to provide the necessary distance measurements. We chose the HC-SR04 sensor module because of its low price (approx. 2,60 €) and high popularity, which means driver software and documentation are widely available. However, during the course of the project we learned that the sensor is very sensitive to ambient noise and can be rendered unusable by propeller noise (see section "Problems faced and solutions"). Overall, the sensor is quite accurate (better than 1 cm), but not usable in flight, our main usage scenario. Figure 7 shows the sensor.

Figure 7: HC-SR04 ultrasonic distance sensor [8]

[8] Source: http://letsmakerobots.com/node/30209
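For reference, the HC-SR04 works by simple acoustic time of flight: a trigger pulse starts a 40 kHz ultrasonic burst, and the echo pin stays high for the round-trip time of the sound. The sketch below illustrates this principle; it is not the driver we actually used, and the GPIO and timing functions (gpio_write, gpio_read, micros, delay_us) are hypothetical platform placeholders:

    // Sketch of the HC-SR04 time-of-flight principle (not the driver we used).
    #include <cstdint>

    const int TRIG_PIN = 0;  // pin numbers are placeholders
    const int ECHO_PIN = 1;

    void     gpio_write(int pin, bool level);   // assumed to exist
    bool     gpio_read(int pin);                // assumed to exist
    uint32_t micros();                          // microsecond timestamp
    void     delay_us(uint32_t us);

    // Returns the measured distance in centimeters, or -1 on timeout.
    float hcsr04_read_cm() {
        // A pulse of at least 10 us on TRIG starts a 40 kHz ultrasonic burst.
        gpio_write(TRIG_PIN, true);
        delay_us(10);
        gpio_write(TRIG_PIN, false);

        // ECHO stays high for the round-trip time of the sound.
        uint32_t t0 = micros();
        while (!gpio_read(ECHO_PIN)) { if (micros() - t0 > 30000) return -1.0f; }
        uint32_t rise = micros();
        while (gpio_read(ECHO_PIN))  { if (micros() - rise > 30000) return -1.0f; }
        uint32_t echo_us = micros() - rise;

        // Sound travels about 0.0343 cm/us; halve for the round trip.
        return echo_us * 0.0343f / 2.0f;
    }

This also suggests why propeller noise is fatal for the sensor: the echo detection is purely acoustic, so extraneous sound can shorten or corrupt the echo window.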

Laser pointer

Due to the problems faced with the ultrasonic sensor, we decided to implement a laser altimeter using a laser pointer and the down-facing webcam. For this purpose, we bought a laser pointer (2,85 €). In choosing one, we had only one specific requirement: it should be able to run off an input voltage of 5 V, which is already provided by the ESCs.




Global control

The global control computer has to be able to use two cameras (front-facing and down-facing), process the images obtained, and forward commands to the local control board. We decided to use a Raspberry Pi Model B (39,- €, including case). The main reason for choosing this platform was that a group member already owned a Raspberry Pi, so testing and development could begin immediately and no further costs were incurred. The Raspberry Pi runs a Linux system on an ARM11 processor. It offers two USB ports, which could be used for a USB webcam and a WiFi dongle. A CSI connector is available to connect the camera board. The operating system is installed on an SD card. For this purpose, we bought an 8 GB SD card which had been shown to work well with the Raspberry Pi [9]: an 8 GB SanDisk SDHC Class 10 card (8,47 €). The USB WiFi dongle was also chosen based on compatibility with the Raspberry Pi [10] (14,27 €). Figure 8 shows the Raspberry Pi Model B board.

Figure 8: Raspberry Pi single board computer [11]



[9] http://elinux.org/RPi_SD_cards
[10] http://uk.farnell.com/element14/wipi/dongle-wifi-usb-for-raspberry-pi/dp/2133900?Ntt=2133900
[11] Source: http://en.wikipedia.org/wiki/Raspberry_Pi


Front-Facing Camera: Raspberry Pi Camera Module

One of the tasks to be accomplished is taking a panoramic image during flight. To do this, and also to detect markers placed on the walls, it is necessary to have a camera placed on the side of the quadrotor. This camera should support higher resolutions to be able to take high-quality images during flight. We decided to use the Raspberry Pi camera module and acquired one as soon as it became available. The camera is connected to the Raspberry Pi's CSI connector via a ribbon cable, which has the advantage that the camera does not take up a USB port. Furthermore, the camera module encodes images in JPEG format in hardware, leading to better performance when recording JPEG images to the SD card. One disadvantage of the camera board is the fragile ribbon cable used to connect it. We have not had a specific issue with the connection during the course of this project, but it seems like a likely point of failure. The camera module cost approximately 23,60 €. It is shown in Figure 9.


Figure 9: Camera module for Raspberry Pi [12]

[12] Source: http://www.raspberrypi.org/wp-content/uploads/2013/02/camerafront.jpg



Down-Facing Camera: Microsoft LifeCam HD 3000

Aside from the front-facing camera, it is important to have a down-facing camera pointed at the ground. Using images from this camera, the quadrotor should be able to detect whether it is drifting to the side and correct for this drift. Our main consideration in choosing an appropriate webcam was again compatibility with the Raspberry Pi, which has to process the images from this camera. We chose Microsoft's LifeCam HD 3000, because it had been shown to be compatible [13] and it is reasonably priced (19,90 €). Figure 10 shows the webcam. Before mounting the camera to the frame, we removed the heavy camera stand.

Figure 10: Webcam for downward-facing camera [14]

[13] http://elinux.org/RPi_USB_Webcams
[14] http://www.microsoft.com/hardware/en-gb/p/lifecam-hd-3000




Cost list

One of the main focus points of our project was the approach of using relatively low-cost components. Table 1 lists the costs of the individual components. Compared to the previous year's "recommended" parts list, we achieved total cost savings of about 35%, with no significant decrease in performance. The list shows the components as described in the previous sections. One additional motor had to be purchased because a bearing in one of the originals broke during testing. Furthermore, for development we purchased one additional ultrasonic sensor and an additional flight control board, so multiple team members could work and test simultaneously. These costs are not included in the list, because the list is meant to illustrate the basic cost of the final hardware.

Amount  Part Description             Price EUR   Total EUR
1       Frame                          22,01 €     22,01 €
1       Battery                        24,28 €     24,28 €
4       Motor                           6,21 €     24,85 €
4       ESC                             6,09 €     24,37 €
1       Connectors                      4,34 €      4,34 €
1       Shipping Hobbyking              2,43 €      2,43 €
1       SD Card 8Gb                     8,47 €      8,47 €
1       Battery Charger                29,99 €     29,99 €
1       RC Transmitter                 38,93 €     38,93 €
1       Bluetooth Adapter               6,61 €      6,61 €
1       USB to TTL Converter            2,18 €      2,18 €
2       HC-SR04 Ultrasonic Sensors      2,58 €      5,16 €
3       Styrofoam Glue                  5,39 €     16,17 €
1       Webcam Downfacing              19,90 €     19,90 €
1       Laser Pointer                   2,84 €      2,84 €
1       Raspberry Pi Camera Board      23,63 €     23,63 €
1       Raspberry Pi WiFi Dongle       14,27 €     14,27 €
1       Raspberry Pi incl. Case        38,99 €     38,99 €
1       Propeller Set                   4,75 €      4,75 €
1       Flight Controller Board        13,45 €     13,45 €
        Total                                     327,60 €

Table 1: Cost list for building the quadcopter

Hardware architecture of our system

The connections between the major hardware components are illustrated in Figure 11. We distinguish three major networks. The power distribution network supplies all components with power by connecting them to the main flight battery; the ESCs also have a 5 V output and supply power to all components needing 5 V. The second network comprises the components involved in local control, which are all connected to the DiscoveryF3 flight control board: the control lines for the ESCs, the RF transceiver, the sonar altimeter, and the Bluetooth module for telemetry. Finally, the global control network includes the components connected to the RaspberryPi: the down-facing USB webcam, the front-facing camera module, and the USB WiFi dongle.

Figure 11: Hardware architecture of the quadcopter


Figure 12 shows a top view of the finished aircraft. From the top, we can see the motors and propellers, the flight control board (center), the RaspberryPi in its case (front), the Bluetooth module, and the RF transceiver.

Figure 12: Top-view of the quadcopter


Figure 13 shows a bottom view of the quadcopter. Here we can see the motor controllers, the down-facing webcam, the laser pointer for altitude measurement, and the ultrasonic sensor.

Figure 13: Bottom view of the quadcopter


Figure 14 shows a close-up of the ultrasonic sensor and a motor ESC. We tried to mount the ultrasonic sensor close to the center of the quadcopter, so the altitude measurement would be accurate relative to the center.

Figure 14: Ultrasonic sensor and ESC


The concept of the LASER-based altimeter using the down-facing webcam is explained in section "Ultrasound sensor not reliable". Figure 15 shows how the webcam and laser pointer are mounted to the frame. The webcam is still in its original case and is cradled in styrofoam and stabilized with a lot of tape. This is to prevent the camera from shifting its field of view, which must stay fixed to match the calibrated values of the laser altimeter setup.

Figure 15: Laser pointer and webcam


The RaspberryPi has a set of General Purpose Input/Output (GPIO) ports. We used a multi-port connector cable to connect power, ground, UART input, and UART output. We used a clear plastic case into which the RaspberryPi snaps, and fixed the case to the quadcopter frame using cable ties. The WiFi dongle and the down-facing webcam are connected via USB. Finally, a ribbon cable interfaces the RaspberryPi Camera Module via the CSI connector. Figure 16 shows this setup.

Figure 16: RaspberryPi and Camera Module


Software architecture

PID tuning

The OpenPilot control software applies PID control loops to set the speeds of the quadcopter's four motors. In general, a PID (proportional-integral-derivative) controller takes the current measured value of a process parameter and compares it to a desired value, using the difference ("error") to compute the output needed to minimize this error. In the case of the quadcopter, each axis (roll, pitch, yaw) is controlled by a separate PID loop. Figure 17 illustrates how the loops work in OpenPilot.

Figure 17: PID Dual Loop structure in OpenPilot [15]

[15] Source: http://wiki.openpilot.org/display/Doc/Stabilization+Panel

In this case there are two nested control loops: the inner Rate Loop and the outer Attitude Loop. In the rate loop, the control software takes an input value representing the desired rate of change (angular velocity) for a given axis. The difference between the current angular velocity and the input value gives the error to be corrected. It is possible to select "rate mode" for the control of the quadcopter, in which case only the inner loop is used. For example, pushing the controller stick for roll all the way to the right translates to a desired rotation of 150 deg/sec. Using the sensors (gyroscopes, magnetometers, accelerometers), the software determines the current angular velocity around the roll axis and compares it to the desired rotation speed. If necessary, the software changes the motor speeds to reach this rotational speed. If the controller stick is released (back in the center position), this translates to a desired rotational speed of 0 deg/sec. This means that if the user centers the controller while the quadcopter is at an angle of 10 deg, it will try to stay at this angle. It is possible to steer the quadcopter in this mode, but the fact that the zero position of the control does not necessarily correspond to a 0° angle means that the quadcopter will not try to level out automatically. This is not very intuitive and takes a lot of experience for a human operator.

For this reason, the OpenPilot software also offers "attitude mode", which adds an outer Attitude Loop to the control mechanism. In this case, the controller signal is fed as input to this loop and interpreted as a desired orientation (attitude) for the aircraft. The software uses the sensor data to determine the quadcopter's current orientation. For each axis, an error is calculated between the desired and current attitude. Then, the proportional, integral, and derivative coefficients are used to calculate the rate at which the quadcopter should try to correct the error. This calculated rate is then fed into the inner Rate Loop to be converted into motor speeds.

For these PID systems, it is important to find good values for all coefficients to ensure the stability of the quadcopter. This process is called PID tuning, and there are multiple methods for obtaining good values. We decided to follow the method described in the OpenPilot documentation [16], which is based on the step response method. Normally, the method relies on analyzing a plot of the system's response to a drastic (step) change of the desired value [17]. However, the OpenPilot software does not provide detailed logging mechanisms "out of the box" that would allow analyzing a plot of the system behavior in detail. Instead, the method relies on observing the quadcopter's behavior rather than analyzing a plot. This has the disadvantage that some experience with the system is necessary to judge the behavior, but we decided to start the PID tuning directly rather than first figure out how to log information and obtain better plots. The method is further described in a tutorial video [18] by the OpenPilot team, which shows in detail how to make the appropriate adjustments and what to look for in the quadcopter's behavior.
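The following sketch makes the cascaded dual-loop structure concrete. It is an illustration of the mechanism described above, not the actual OpenPilot code; the names and the simple discrete integration scheme are our own:

    // Illustrative cascaded attitude/rate controller for one axis.
    struct Pid {
        float kp = 0.0f, ki = 0.0f;
        float integral = 0.0f;

        float update(float error, float dt) {
            integral += error * dt;
            return kp * error + ki * integral;
        }
    };

    struct AxisController {
        Pid attitude;  // outer loop: angle error -> desired angular rate
        Pid rate;      // inner loop: rate error  -> actuator command

        // desired_angle comes from the stick; angle and gyro_rate from sensors.
        float update(float desired_angle, float angle, float gyro_rate, float dt) {
            float desired_rate = attitude.update(desired_angle - angle, dt);
            return rate.update(desired_rate - gyro_rate, dt);  // to the mixer
        }
    };

In rate mode, the outer loop is bypassed: the stick value itself is used directly as desired_rate.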

Tuning Setup

During PID tuning, each axis should be tuned individually. While the pitch and roll axes are likely to end up with (nearly) identical PID coefficients due to the symmetric layout of the quadcopter, it is still important to test each axis separately to rule out interference in the behavior from another axis. The method used relies on fixing the quadcopter in a way that allows rotation around only one axis (e.g. pitch). We decided on tying the frame down between two chairs. One additional point to watch out for is to make sure that the tests are conducted high enough off the ground (> 50 cm) to reduce the effects of ground turbulence. Our setup is shown in Figure 18.

Figure 18: PID Tuning setup


[16] http://wiki.openpilot.org/display/Doc/Stabilization+Tuning++Multirotor
[17] www.hh.se/download/18.7502f6fd13ccb4fa8cb2f6/1360676928529/PID.pdf
[18] http://www.openpilot.org/pid-tuning


To test the behavior of the system, the throttle is increased until the quadcopter hovers without tension on the rope. Then, the controller for the free axis is moved quickly to extreme positions and back. Based on how quickly the quadcopter responds and on the presence of oscillations, the values of the PID coefficients are increased or decreased. To some degree, the behavior can be verified on the plots in the ground control station software. As mentioned in the tutorial video, the settings obtained with this setup will likely be too high and will need to be corrected slightly based on the quadcopter's behavior in flight.

Rate Loop Tuning

We started the tuning process with the pitch axis. To tune the coefficients for the rate loop, we switched the pitch axis to rate mode and turned the other axes off (Figure 19).

Figure 19: Mode settings for rate loop tuning

Figure 20 shows the default settings for the rate loop coefficients.

Figure 20: Inner Loop default coefficients


As we can see, only the proportional components are set. According to the OpenPilot documentation, integral components in both the inner and outer loops may have negative and unpredictable interactions. Therefore, we decided not to set an integral component in the inner loop and left the coefficient at 0. The derivative component is meant to add extra speed to the control action in case of a large change in the error. This may be useful for quick maneuvers or to balance out fast gusts of wind. Since we planned to fly the quadcopter slowly, as stably as possible, and indoors, we decided not to set a derivative coefficient either. With a proportional coefficient of 0,002, the quadcopter's response was quite slow. Figure 21 shows the plot for this test. The green plot shows the pitch axis, which was being tested. The highlighted slopes show the quadcopter's reaction to controller inputs. We can see that the slopes are not very steep, showing the slow reaction time.

Figure 21: Rate mode response with default settings


We increased the proportional coefficient up to the point where we could see fast oscillations during hovering. An example plot clearly showing these fast oscillations can be found in Figure 22. These were observed with a coefficient of 0,008.

Figure 22: Rate mode response with proportional coefficient too high

Finally, we settled on a proportional coefficient of 0,003 for the inner loop. This setting gave the best tradeoff between reaction speed and stability. We could have increased the value more for a more “aggressive” setting, but for our application a stable flight without high vibrations is more important than a fast reaction time of the aircraft.

Attitude Loop Tuning

After the inner loop was tuned properly, we were able to start tuning the outer loop. As mentioned earlier, we now had to switch the flight mode to "attitude mode". The default PID values for the attitude loop are shown in Figure 23.

Figure 23: Outer Loop default coefficients


Again, we increased the proportional coefficient until oscillations became apparent. Because these oscillations are caused by the outer loop, they are much slower than those observed during rate loop tuning. Figure 24 shows the oscillations resulting from a proportional coefficient of 8,0. Note that only the highlighted oscillations were observed during hovering and were caused by the PID settings; the uniformly decreasing oscillations visible afterwards simply resulted from turning down the throttle and leaving the quadcopter to come to rest on the ropes.

Figure 24: Oscillations due to a high proportional coefficient in the attitude loop

Finally, we set the proportional coefficient for the outer loop to 3,5. Figure 25 shows the system behavior with well-tuned coefficients. The plot shows a quick response and only minimal overshoot.

Figure 25: System with well-tuned outer loop proportional coefficient


We then applied the values obtained for the pitch axis to the roll axis as well. As mentioned earlier, due to the symmetry we expected similar values to be ideal. We performed the same tests on the roll axis and found that identical values do in fact work best. The effects of the integral value are a little more subtle, so we decided to carefully add it in flight.

In-flight corrections

After pitch and roll had been tuned with inner and outer loop proportional values, we decided that the setup should be stable enough to be tested in flight. It is worth mentioning that the yaw axis remains in rate mode: if the yaw axis is set to attitude mode, the zero position on the controller causes the quadcopter to yaw to face 0° (north), using the magnetometer reading as a reference. This is obviously not very intuitive for flight. Also, the yaw axis does not affect the stability of the aircraft as much as pitch and roll. Therefore, we set yaw control to rate mode and decided to also tune the coefficients for this axis in flight. The aircraft already handled very well with the obtained settings. We were able to improve stability further by setting the integral value for the outer loop to 0,006. Again, this value was obtained by increasing the coefficient until the quadcopter became unstable in flight. We left the proportional and integral coefficients for the yaw axis at 0,0035. One issue we kept having was the quadcopter drifting to the side immediately upon takeoff. We found some entries in the OpenPilot forums [19] suggesting increasing the AccelKp value, which causes the accelerometers to be weighted more heavily relative to the gyroscopes. Changing the value from 0,05 to 0,1 helped to stabilize the quadcopter on takeoff. A certain amount of forward drift remained, which we mitigated further by changing the virtual pitch of the controller board in the attitude settings to -3,3°.


[19] http://forums.openpilot.org/topic/13528-attitude-drift/


Local control – OpenPilot

OpenPilot is a free-software unmanned aerial vehicle project distributed under the GPL. The project uses a real-time operating system derived from FreeRTOS, an open-source operating system for microcontrollers in real-time applications. OpenPilot supports both helicopters and fixed-wing aircraft with the same autopilot avionics. A GUI-based Ground Control Station (GCS) is included in the project to control the various parameters of the UAV (Unmanned Aerial Vehicle) and receive a stream of flight data [20].

Overview of OpenPilot

Since the DiscoveryF3 board is used for local control, for the reasons stated in the previous section, we had to move from the OpenPilot project to the TauLabs project [21] because of its support for this platform. The TauLabs project is a fork of the OpenPilot project with quite a few enthusiasts working on it. For our project, we used the port of OpenPilot to the DiscoveryF3 board created and further developed by TauLabs; in the rest of this document, when we mention OpenPilot, we mean the fork by TauLabs. This section begins with an introduction to the OpenPilot architecture, which is required to explain the changes we made for our project.

UAV Objects

UAV Objects are the containers that store the definitions and settings of the different aspects of the UAV. Some examples of UAV Objects are ADCRouting, FlightBattery, GyroBias, and SystemAlarms. They are stored as XML files, which are used when compiling both the GCS and the flight firmware, because the two need to agree on the UAV Object definitions. UAV Objects are the means for exchanging data between modules. An example 'Magnetometer.xml' file, showing how a UAV Object is stored, is shown in Snippet 1.

Snippet 1: Example UAV Object xml file - Magnetometer.xml


[20] "Build your own quadrotor" by H. Lim, J. Park, D. Lee, H.J. Kim, 2012


[21] https://github.com/TauLabs/TauLabs


A protocol called UAVTalk uses these UAV Object definitions for the communication between the GCS and the UAV. Because the same UAV Object definitions are known to both the GCS and the UAV, the protocol can be understood by both.

Libraries

Libraries provide a set of services that can be used by other libraries and modules (which are defined in the next section). A library's services can be accessed through the Application Programming Interface (API) it exposes. A library does not run in a thread of its own. All libraries must be thread-safe and protect their internal state from concurrent accesses through the use of a semaphore or mutex. OpenPilot has two main libraries, namely PiOS and OpenPilot.

PiOS Libraries – The PiOS libraries abstract the underlying hardware in a thread-safe manner. PiOS also provides access to FreeRTOS's APIs and hence is an RTOS abstraction layer too. To maintain portability of the software across different hardware, the PiOS library should not contain functionality specific to OpenPilot; it should be thought of as a general-purpose hardware abstraction layer. For example, the APIs to access the COM ports of different hardware are added to the COM library file in the PiOS library.

OpenPilot Libraries – The OpenPilot libraries provide, through their APIs, thread-safe access to functionality specific to OpenPilot. However, only services that may be of use to more than one module should be added to these libraries. For example, access to UAVTalk or UAVObjects is needed by all modules and is therefore included in these libraries.

Modules

The modules perform the actual work in the OpenPilot software. Modules primarily interface with the rest of the software in two ways: by using the APIs of the OpenPilot and PiOS libraries, or by reading and updating UAVObjects. Direct module-to-module communication is strictly prohibited, which means that modules do not expose any public interface (except for initialization). Each module creates a thread and runs in it. Each module has an RTOS queue on which it can wait for events whenever UAVObjects of interest are updated. Modules can copy any UAVObject's data and also update any UAVObject with new data. The dynamics of this system are such that each module updates the objects required by other modules while it reads objects updated by other modules; in this way, modules communicate with each other. This convention also provides an easy and flexible way to add, remove, or change modules without affecting the entire system, as long as the concerned objects are updated.

Strategy for autonomous control of local control (OpenPilot) by Global Control

Communication channel

In autonomous flight, the responsibility of the local control is to enable the quadcopter to fly stably, while the global control directs the quadcopter on where to fly and what to do next. This means there needs to be a communication channel between the global control and the local control. The primary requirement was that setting up and communicating over this channel needed to be simple in both global and local control.


Serial communication using a Universal Asynchronous Receiver/Transmitter (UART) was chosen as the communication channel. Apart from the UART channel used for telemetry communication with the Bluetooth module, all the other UART channels of the STM32F3 microcontroller on the DiscoveryF3 board were unused. Furthermore, the serial pins are easily accessible on both the Raspberry Pi and the BeagleBoard for the global control.

Mode of operation of local control

Among the different modes, such as attitude/rate stabilization and path planner, the AltitudeHold mode offered the best granularity with respect to the amount of control the global control has over the local control. In AltitudeHold mode, without any inputs, the quadcopter is supposed to hover at its current height. With the RC transmitter, pitch, roll, and yaw are controlled as in the stabilized mode, while the altitude control provides the offset required from the current altitude. This mode is well suited to autonomous control because, in theory, once the required altitude is reached, this mode ensures that the quadcopter maintains it. This allows the global control to guide the quadcopter in a horizontal plane with roll, pitch, and yaw inputs. The altitude can also be changed, if necessary, through the offset provided by the desired altitude. The AltitudeHold mode depends on the availability of the quadcopter's current altitude, which is usually provided by either a barometer or a sonar sensor.

Protocol for Communication

As mentioned in the hardware section, the ultrasonic sonar sensor was not a reliable source of the quadcopter's altitude and was replaced by a laser pointer and downward-facing webcam combination. The exact implementation of finding the altitude is explained in the section "Ultrasound sensor not reliable". As the webcam is accessed by the global control software, the current altitude information is generated there and also needs to be communicated to the local control, where it is needed by the AltitudeHold module. The protocol for the communication between local and global control was as follows:

Communication channel: UART at 9600 bps with 8 data bits, 1 start bit, 1 stop bit, and no parity bit.

Each transfer consists of the following 5 bytes: [current altitude, desired altitude, yaw, roll, pitch]

Current altitude and desired altitude: unsigned characters, with the value representing height in centimeters, for a maximum height of 2.55 m.

Yaw, roll, and pitch: signed characters representing rotation in degrees, for a range of -128 to 127 degrees.

Implementation

The data from the serial channel has to be received to provide the current altitude and the desired parameters. The current altitude was initially provided by the HC-SR04 sonar sensor, so its driver was modified to also accept input from the serial source. The desired parameters for the quadcopter are usually provided by the RC receiver, so the ManualControl module was adapted to provide inputs for the desired altitude, roll, pitch, and yaw. Through these changes, only the source of the data changes; the rest of the operation of the OpenPilot software remains unchanged. Also, in both these places, the data from the serial input is processed in preference to the normal input source depending on the channel-6 toggle switch of the RC receiver, i.e. the serial data is accepted only if the toggle switch is flipped. Since data can be sent from the global control at a rate greater than the rate at which it is processed in the local control, we needed a mechanism to process only the latest data received. Also, since the data has to be accepted in two places that cannot directly communicate with each other (the PiOS library and the module), old packets have to be discarded without coordination. This was implemented using the algorithm illustrated in the flowchart in Figure 26. The basic operation is to get the number of bytes available in the buffer, determine how many complete packets (5 bytes) are available, discard all but the newest if there is more than one, and finally process the data in the latest packet. This approach is used for retrieving both the current altitude and the desired parameters. A sketch of this logic is given after Figure 26.


Figure 26: Flowchart for receiving data from the global control
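The following sketch illustrates the "keep only the newest packet" logic from Figure 26. The functions rx_bytes_available() and rx_read() are generic stand-ins; the actual PiOS COM-driver calls in our code have different names:

    // Sketch of the packet-discarding receive logic from Figure 26.
    #include <cstdint>

    const int PACKET_SIZE = 5;  // [cur. altitude, des. altitude, yaw, roll, pitch]

    uint16_t rx_bytes_available();               // assumed driver call
    void     rx_read(uint8_t* buf, uint16_t n);  // assumed driver call

    // Returns true and fills 'packet' if at least one complete packet was
    // received; all but the most recent complete packet are discarded.
    bool receive_latest_packet(uint8_t packet[PACKET_SIZE]) {
        uint16_t complete = rx_bytes_available() / PACKET_SIZE;
        if (complete == 0)
            return false;  // no complete packet in the buffer yet

        uint8_t scratch[PACKET_SIZE];
        for (uint16_t i = 0; i + 1 < complete; ++i)
            rx_read(scratch, PACKET_SIZE);  // drain the stale packets

        rx_read(packet, PACKET_SIZE);       // process only the newest one
        return true;
    }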


Global control – Application using OpenCV on R-Pi

While the local control algorithm keeps the quadcopter stable in flight, the global control algorithm decides the trajectory of the quadcopter. We decided to use a RaspberryPi to run our global control algorithm. The UART interface was used for the interaction between the global control algorithm and the local control algorithm running on the STM32 board. A downward-facing camera was used to detect the movement of the quadcopter, and a front-facing camera was used to guide the direction of motion of the quadcopter. The algorithm used the two cameras interfaced with the Pi to implement computer vision.

Computer vision

Computer vision is concerned with modeling and replicating human vision using computer software and hardware. It is a discipline that studies how to reconstruct, interpret, and understand a 3D scene from its 2D images in terms of the properties of the structures present in the scene. It combines knowledge from computer science, electrical engineering, mathematics, physiology, biology, and cognitive science, all of which are needed to understand and simulate the operation of the human vision system. Computer vision overlaps significantly with the fields of image processing and pattern recognition. Most importantly, computer vision includes techniques for the useful description of shape, volume, etc. for so-called cognitive processing. To implement the global algorithm, we used the OpenCV (Open Source Computer Vision) library.

OpenCV

OpenCV is an open-source computer vision and machine learning software library. It was built to accelerate the use of machine perception in commercial products by providing a common infrastructure for computer vision applications. It is available under a BSD license and is thus free to use. The library consists of more than 2500 optimized algorithms, including a comprehensive set of both classic and state-of-the-art computer vision and machine learning algorithms. The library is cross-platform, has C++, C, Python, Java, and MATLAB interfaces, and supports Windows, Linux, Android, and Mac OS. Due to these characteristics, we used the OpenCV library to implement the image processing in our application.

Setup with Eclipse

Eclipse is a multi-language Integrated Development Environment (IDE) comprising a base workspace and an extensible plug-in system for customizing the environment. Released under the terms of the Eclipse Public License, the Eclipse SDK is free and open-source software. We used Eclipse CDT, a plugin that provides support for the C/C++ languages. The advantages and features of Eclipse were the main reasons for selecting this particular IDE.


The Application

The aim of our application was to autonomously detect AprilTags (see subsection "Front facing camera algorithm") pasted on the walls of a room at a particular height, and to make the quadcopter travel from one marker to another.

Global Algorithm

Based on the differences in the usage of the two cameras, our global control algorithm can be split into two parts, as illustrated in Figure 27.

Figure 27: Software architecture of the global control

a) Downward facing camera algorithm

We used the down-facing camera for drift detection, using the pyramidal Lucas-Kanade optical flow algorithm, and for altitude detection, using a LASER and the webcam. Since our application was to run inside a room, we could not use GPS data to accurately determine the position of the quadcopter. There was a high probability that the quadcopter would start drifting in a random direction without the control system knowing about it. Thus, in order to identify such unwanted movements and to improve the stability of the quadcopter, we decided to use optical flow detection.


The pyramidal Lucas-Kanade optical flow algorithm decomposes the image into pyramids of different resolutions. The Lucas-Kanade method [22] is a widely used differential method for optical flow estimation, developed by Bruce D. Lucas and Takeo Kanade. It assumes that the flow is essentially constant in a local neighborhood of the pixel under consideration and solves the basic optical flow equations for all pixels in that neighborhood by the least squares criterion. By combining information from several nearby pixels, the Lucas-Kanade method can often resolve the inherent ambiguity of the optical flow equation. It is also less sensitive to image noise than point-wise methods. On the other hand, since it is a purely local method, it cannot provide flow information in the interior of uniform regions of the image.

We hoped to reduce the drift using this method and developed an application that takes two pictures from a USB webcam and then compares these images to determine a general direction of the optical flow. Using the standard OpenCV functions, we first identify points of interest (cvGoodFeaturesToTrack()) using the Shi and Tomasi algorithm [23]. Then the positions of these points are tracked in the next image (cvCalcOpticalFlowPyrLK()) based on the Lucas-Kanade method. We then calculate the mean displacement of these points in the x and y directions to determine the drift of the quadcopter. Based on the drift information, we implemented a simple PID controller for each direction, which tries to compensate for the drift by calculating correction angles for pitch and roll. A minimal sketch of this processing chain is given below.

We ran into several problems in the implementation. First, due to the limited computing power of the RaspberryPi and the speed limitations of the USB webcam, the frame rate of this detection is rather low: in our tests, one iteration of the program (a comparison of two frames) took about 350-400 ms. Originally, we were hoping for a quadcopter that is already very stable using only local control and has little drift. However, our quadcopter still has considerable drift. This may be due to inaccurate sensors, imperfectly aligned motors, RF interference, and uneven power draw on the different ESCs (e.g. only one ESC powers the RaspberryPi). Turbulence created by the aircraft itself in confined rooms may also contribute. As a consequence, the quadcopter may already have drifted too far between two frames to react in time and keep the points of interest found in the first frame as a reference.

Another unsolved problem is the fact that a correction in pitch or roll also leads to a significant change in the field of view of the camera. This change is potentially greater than the drift the camera is supposed to recognize, which may unintentionally amplify any corrections made by the optical flow controller. We tried to correct for this by calculating the error differential and, unlike in a conventional PID controller, subtracting it from the output. Because the altitude measurement by sonar, and therefore the altitude hold mode, did not work, we could not thoroughly test the system to find optimal PID coefficients.

Up to this point, we focused on correcting the drift in the pitch and roll axes only. Tracking the drift around the yaw axis might be possible by splitting the image into sections and calculating a yaw angle from the relative movements in the different sections. However, we did not manage to implement this due to time restrictions.
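The sketch below shows the drift estimation step. It is written against the modern OpenCV C++ API rather than the legacy C functions named above, and the parameter values are illustrative, not our tuned settings:

    // Estimates the mean (dx, dy) drift between two grayscale frames using
    // Shi-Tomasi corners and pyramidal Lucas-Kanade tracking.
    #include <opencv2/opencv.hpp>
    #include <opencv2/video/tracking.hpp>
    #include <vector>

    bool estimate_drift(const cv::Mat& prev, const cv::Mat& next,
                        cv::Point2f& drift) {
        // Shi-Tomasi corners in the previous frame.
        std::vector<cv::Point2f> pts_prev, pts_next;
        cv::goodFeaturesToTrack(prev, pts_prev, 100 /*max corners*/,
                                0.01 /*quality*/, 10 /*min distance*/);
        if (pts_prev.empty()) return false;

        // Pyramidal Lucas-Kanade tracking into the next frame.
        std::vector<unsigned char> status;
        std::vector<float> err;
        cv::calcOpticalFlowPyrLK(prev, next, pts_prev, pts_next, status, err);

        // Mean displacement of the successfully tracked points.
        cv::Point2f sum(0.f, 0.f);
        int tracked = 0;
        for (size_t i = 0; i < pts_prev.size(); ++i) {
            if (!status[i]) continue;
            sum += pts_next[i] - pts_prev[i];
            ++tracked;
        }
        if (tracked == 0) return false;
        drift = sum * (1.f / tracked);
        return true;
    }

The resulting drift vector is what our per-direction PID controllers consume to compute the pitch and roll corrections.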

[22] "An iterative image registration technique with an application to stereo vision" by B.D. Lucas and T. Kanade, 1981
[23] "Good Features to Track" by J. Shi and C. Tomasi, 1994


b) Front facing camera algorithm

We used the front-facing camera for AprilTag analysis and for taking in-flight pictures and videos. AprilTags provides a framework [24] for creating and detecting fiducials (artificial visual markers). Using this framework, it is possible to determine the relative position between the camera and the marker, including the angles (pitch, roll, yaw) and the distance to the marker. Figure 28 shows the detection of AprilTags in the demonstration software provided by the framework.

Figure 28: AprilTag detection [25]


We tested only the AprilTags demo application for detecting markers. Some adaptations had to be made to run it on the Raspberry Pi and to select the correct camera. Similarly, we used the default raspistill and raspivid applications [26] to record in-flight time-lapse photographs and videos. The source code for these applications is available, and it should not be too hard to integrate the necessary functions into one global control application. Since we could not build on a stable quadcopter and global control processing already seemed rather slow, we focused on other issues rather than on the integration of these functions.


[24] http://april.eecs.umich.edu/wiki/index.php/April_Tags
[25] Source: "AprilTag: A robust and flexible visual fiducial system" by E. Olson, UMich, 2011
[26] http://www.raspberrypi.org/camera


Problems faced and solutions

Ultrasound sensor not reliable – investigation, solutions tried, and finally use of a laser pointer

Originally, we planned to use a down-facing ultrasonic range finder to measure the current altitude, which is necessary for the aircraft to hold its altitude. The OpenPilot software already contains an "AltitudeHold" module, which implements all the code necessary to hold a certain altitude. This code is again based on a PID controller, which tries to minimize the error in the current altitude by controlling the throttle. We wanted to use the HC-SR04 ultrasonic sensor; the driver software for this sensor was provided by Group 3. Eventually, we managed to get the sensor to work quite accurately (in the range of +/- 1 cm). However, when trying to use the altitude information provided by the ultrasonic sensor for altitude hold, we found that the quadcopter would not hold its altitude well and even had a tendency to gain altitude uncontrollably. When researching the issue, we found that the ultrasonic sensor suffers from heavy interference from the sound of the propellers. We ruled out RF interference and similar causes by using compressed air in various places around the sensor: just the sound of the compressed air leads to considerable deviations in the measured altitude, even if the airstream does not interfere with the "line of sight" of the sensor at all. Figure 29 shows the effect of turning on the motors, with the propellers attached, on the measured altitude. In the experiment shown, the quadcopter was fixed at a height of about 1 m above the ground. However, when turning up the throttle, the measured value dropped down to as low as about 55 cm. We considered correcting the error in the sensor value by correlating it with the current throttle value, but in experiments we found that no matter what the actual altitude was, at more than about ¾ throttle the measurement was always around 55 cm. This rules out a correction in software.

Figure 29: Error in altitude measurement by sonar caused by propeller noise


After concluding that an altitude measurement by sonar was not possible, we decided to look for alternative methods. Eventually, we found a tutorial [27] for a simple LASER range finder based on detecting the laser dot in a webcam image. Since we were already taking images with the down-facing webcam and the approach required no further expensive hardware, we decided to use this method. It assumes that the laser dot is the brightest red area in the image, so it only works reliably in an indoor environment that is not too brightly lit. Figure 30 illustrates the idea behind the system. By recording how far from the image center the laser dot is detected, the angle (θ) at which the dot is observed can be calculated. Using this angle and the known distance between the LASER and the camera, it is possible to calculate the distance to the plane the dot is projected on. The calculation is sketched below Figure 30.


Figure 30: Schematic of the LASER range finder [28]
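Assuming the linear pixel-to-angle calibration used in the tutorial, the distance computation reduces to a few lines. The constants below are illustrative placeholders, not our calibrated values (ours came from the measurements shown in Figure 31):

    // Sketch of the triangulation used by the LASER altimeter.
    #include <cmath>

    const double LASER_CAM_OFFSET_CM = 6.0;     // laser-to-camera-axis distance (assumed)
    const double RAD_PER_PIXEL       = 0.0012;  // from calibration (assumed)
    const double RAD_OFFSET          = 0.0005;  // from calibration (assumed)

    // pixels_from_center: offset of the detected dot from the image center.
    double altitude_cm(double pixels_from_center) {
        // The dot's pixel offset maps (approximately linearly) to the angle
        // between the camera axis and the laser dot...
        double theta = pixels_from_center * RAD_PER_PIXEL + RAD_OFFSET;
        // ...and trigonometry on the laser/camera triangle gives the range.
        return LASER_CAM_OFFSET_CM / std::tan(theta);
    }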

The point where the laser dot moves out of the frame of the webcam image gives the minimum distance that can be measured. The image resolution limits the maximum distance, because at some point the dot will be near the center of the image and a change in distance will no longer result in a different pixel coordinate of the dot. The shape of the arctan function also shows that the accuracy is much higher at small distances than at distances close to the far end of the range. Figure 31 illustrates the measured values used for the calibration of the system. The figure also shows that the minimum measurable distance is 26 cm and the maximum discernible distance is around 2 m, which is quite sufficient for indoor use.


[27] https://sites.google.com/site/todddanko/home/webcam_laser_ranger
[28] Source: https://sites.google.com/site/todddanko/home/webcam_laser_ranger


This range was obtained by finding a good compromise for the distance between the laser pointer and the camera.

Figure 31: Calibration values for the LASER range finder

Tests on several different surfaces and at several distances suggest that the implemented system works quite reliably overall. We noted that on dark surfaces the laser dot is quite small and faint; therefore, detection from distances greater than 1 meter becomes unreliable. The speed of the mechanism is limited by its being incorporated in the main program cycle of the optical flow software. If run in a "standalone" setting, the processing time for one image is around 40-50 ms. It was possible to speed up the processing by using only a small section in the center of the image, because the x-coordinate of the dot does not change. Figure 32 illustrates how the LASER altimeter provides more reliable altitude information than the sonar sensor. The dark red curve shows the information coming from the ultrasound sensor: during the flight, the measured altitude remains around 55 cm. However, the laser altimeter (green plot) shows that the altitude actually varied between 0,5 m and 2,3 m.


Figure 32: LASER altimeter vs. ultrasound altimeter

R-Pi camera drivers and frame rates for processing images

When we started the project, we planned to use the new 5.0 MP camera module built specifically for the Raspberry Pi. However, the RaspberryPi Camera Module did not come with a Video4Linux (v4l) compatible driver, which is required for OpenCV applications. This meant that we could not rely on the use of the camera module and had to proceed with development using USB webcams. Luckily, a compatible third-party driver became available over the course of the project [29]. Eventually, we were able to use the code developed with the USB webcams with the Camera Module as well.

Lack of documentation of the OpenPilot Project

Although there is some basic documentation of OpenPilot's overall architecture, it provides little detail about how the individual libraries and modules are implemented. Fortunately, since it is an open-source project, all the source code is available. However, it is quite a big project with several layers of segmentation in the code; therefore, almost a month was required to get used to the code and understand which section implements which functionality.


[29] http://www.linux-projects.org/modules/sections/index.php?op=viewarticle&artid=14


Project Status

Assignment A: Build the Quadcopter

As explained in the section "Hardware Design of the Quadcopter", we were able to construct a fully functional quadcopter. We focused on using low-cost parts, so it was possible to keep the costs about 35% lower than the recommended parts list based on last year's hardware design. It is important to note that the constructed hardware constitutes a prototype and is not a finished, mature product. The flight controller board is not protected from the top; therefore, the design is sensitive to head-over crashes and moisture.

Assignment B: Tune PID Values

The method used for tuning the PID values for local control is described in the section "PID Tuning". We were able to complete the tuning of the PID values for local control and obtain a quite stable quadcopter. Because the sonar altimeter did not work, we did not have much time to tune the PID values used for altitude hold. Also, because an ESC broke shortly before the project deadline and had to be replaced with a different model, the final quadcopter lost a little stability.

Assignment C: Hover and Take a 360° Panoramic Image

We were not able to complete this task, for several reasons. First of all, to be able to hover, the "AltitudeHold" mode has to work. We could not quite get this to work, because the ultrasonic sensor failed to give reliable distance information. We replaced it with a laser altimeter, but ran out of time before this could be tuned properly. Furthermore, we worked on an optical flow algorithm to stabilize the quadcopter. There was a lot of progress with this algorithm, but the overall setup using a USB webcam with the Raspberry Pi turned out not to be responsive enough to keep the quadcopter stable. Again, due to the lack of a reliable AltitudeHold mode, tuning of the stabilization mechanism could not be pursued further. We experimented with the Raspberry Pi camera module as a front-facing camera. Using the default software, it is possible to take time-lapse pictures every 300 ms and save them to the memory card. We found that image or video recording does not significantly impact the performance of the RaspberryPi computer. The recorded images can later be stitched into a panoramic image; we tried this out with the OpenCV sample program for image stitching (see the sketch below). We also have an operational WiFi connection to transfer the images to a laptop computer for stitching.
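With OpenCV's high-level Stitcher class, the essence of the stitching experiment looks like the following sketch. The file names are placeholders, and the factory function differs between OpenCV versions (Stitcher::createDefault() in the 2.4 series of that time, Stitcher::create() in later versions, as used here):

    // Offline panorama stitching of the snapshots taken during the rotation.
    #include <opencv2/opencv.hpp>
    #include <opencv2/stitching.hpp>
    #include <string>
    #include <vector>

    int main() {
        std::vector<cv::Mat> snapshots;
        for (int i = 0; i < 12; ++i) {  // e.g. one snapshot per 30 degrees of yaw
            cv::Mat img = cv::imread("snapshot_" + std::to_string(i) + ".jpg");
            if (!img.empty()) snapshots.push_back(img);
        }

        cv::Mat panorama;
        cv::Ptr<cv::Stitcher> stitcher = cv::Stitcher::create();
        if (stitcher->stitch(snapshots, panorama) == cv::Stitcher::OK)
            cv::imwrite("panorama.jpg", panorama);
        return 0;
    }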


Assignment D: Follow Markers

To follow markers, we experimented with the AprilTags library. The framework can be used to generate 2D markers. The libraries are based on OpenCV and have been tried out with the Raspberry Pi's front-facing camera. The software returns the position of the marker relative to the camera. We were not able to test an integrated marker-following algorithm, because the quadcopter's stability did not allow for test flights. We focused on fixing the stability problems first, which led us to run out of time before a marker-following algorithm could be implemented. With the possibility of transferring desired roll, pitch, yaw, and altitude values to the local control, it should be rather easy to implement such an algorithm on top of a stable quadcopter. It would simply be a matter of rotating around the yaw axis until a marker has been found. Then, using the angles and distance given by the AprilTags library, the quadcopter can be navigated into a fixed position relative to the marker (e.g. 50 cm straight in front of it). Then, another yaw rotation can be initiated to find the next marker. The marker IDs are also returned by the AprilTags library and can be used to identify the correct markers. A sketch of this loop is given at the end of this section.

Assignment E: Use Markers to Find a Landing Site Depending on Battery Level

For the final assignment, we were planning to monitor the quadcopter's battery level. If a certain threshold is reached, the quadcopter should use the markers (as described in Assignment D) to find a known landing site. Once the final marker is detected, the quadcopter should be able to see another marker on the ground marking the exact landing site. With the ground marker in view of the down-facing camera, the quadcopter should land exactly on the defined spot. The reasoning behind this is that it could then touch the contacts of a charger and recharge automatically. We worked towards this goal by creating a battery monitoring circuit: a simple voltage divider into which the battery can be plugged. The circuit divides the 12 V battery voltage into a proportional voltage below 3.3 V, which can be read by the flight control board's analog-to-digital converters (for illustration, a divider with resistors of, say, 33 kΩ and 10 kΩ would scale 12 V to about 2.8 V). A battery-monitoring module is already available in the OpenPilot software. In some initial tests, we failed to read a value from the AD converter and feed it into the battery-monitoring module of the OpenPilot software. Again, due to time limitations, we were not able to follow up on this problem. However, we did implement a simple piece of software that detects a colored marker and outputs correction values to keep it centered in the image. This might eventually be expanded into an algorithm for centering on the landing site.
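As an illustration, one step of the marker-following loop described in Assignment D could look like the sketch below. The Detection struct and the send_setpoint() call are hypothetical stand-ins for the AprilTags detector output and our 5-byte UART protocol, and the gains and limits are arbitrary:

    // One iteration of a hypothetical marker-following controller.
    #include <algorithm>
    #include <vector>

    struct Detection {       // assumed shape of a tag detection
        int    id;
        double yaw_deg;      // bearing of the tag relative to the camera axis
        double distance_cm;  // range to the tag
    };

    void send_setpoint(signed char yaw, signed char roll,
                       signed char pitch, unsigned char altitude_cm);

    const double TARGET_DISTANCE_CM = 50.0;  // hover 50 cm in front of the tag

    void follow_markers_step(const std::vector<Detection>& tags,
                             unsigned char cruise_altitude_cm) {
        if (tags.empty()) {
            // No tag in sight: rotate slowly around the yaw axis to search.
            send_setpoint(10, 0, 0, cruise_altitude_cm);
            return;
        }
        const Detection& tag = tags.front();
        // Turn towards the tag; pitch forward while still too far away.
        double yaw_cmd = std::max(-20.0, std::min(20.0, tag.yaw_deg));
        signed char pitch = (tag.distance_cm > TARGET_DISTANCE_CM) ? -5 : 0;
        send_setpoint((signed char)yaw_cmd, 0, pitch, cruise_altitude_cm);
        // Once within range, the yaw search for the next marker ID would be
        // started here.
    }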


Conclusions

Here are some noteworthy achievements and observations:

● We built a fully functional and stable quadcopter at about 35% less cost than last year's recommended parts list.
● We implemented a stabilization algorithm based on the down-facing camera.
● We used a sonar sensor to measure the altitude but found it to be impractical due to interference issues. One of our important contributions is an alternative altimeter implemented using a laser and a webcam. We tested its performance and found it to be much better than the sonar sensor in flight.
● We have a standalone front-facing camera that is capable of taking panoramic shots and detecting AprilTags. This is an important part of the global control algorithm and the computer vision work. Recording images or video does not significantly impact the performance of the global control software.
● We found some issues while working with the front-facing camera: image distortion occurs due to the line-wise (rolling-shutter) exposure when vibrations are strong.
● Right now, the complete quadcopter system we built is not stable enough to hover and take a panoramic picture automatically. The front- and down-facing camera functions still need to be integrated in order to implement the complete global algorithm for autonomous flight.
● We still have to further develop the global control algorithm for navigation using AprilTags. This is not implemented yet, but the task will be easy once the stability of the system is improved.
● The overall picture is very favorable. We have been able to develop all the required parts, which have been tested separately. The goal of autonomous flight is not very far off once these parts are integrated into a single stable system.


Appendix A - Work division table

[Table: division of the tasks below among Ashish, Matthias, and Prithvi]

● Building the quadcopter
● Work on OpenPilot (Local Control)
● Configuring and testing the PID settings
● Basic setup of RaspberryPi (OS, WiFi, Serial Comm)
● Developing communication between local and global control
● Work on Raspberry Pi (Global Control)
● Develop optical flow tracking
● Prototype - SURF algorithm for feature detection and tracking using OpenCV
● Setup of RaspberryPi Camera Module
● Implement, calibrate and test LASER altimeter
● Integrate ultrasonic sensor into OpenPilot
● Test ultrasonic sensor
● Integrate battery monitoring into OpenPilot
● Implement serial communication on RaspberryPi
● Implement marker detection and centering with OpenCV
● Research and Test AprilTags framework