VIRTUAL PATH PLANNING MODEL FOR MOBILE ROBOT NAVIGATION

CHEONG ZEE SWEE

UNIVERSITI TEKNOLOGI MALAYSIA

PSZ 19:16 (Pind. 1/07)

UNIVERSITI TEKNOLOGI MALAYSIA

DECLARATION OF THESIS / UNDERGRADUATE PROJECT PAPER AND COPYRIGHT

Author's full name : CHEONG ZEE SWEE
Date of birth      : 870707-23-5137
Title              : VIRTUAL PATH PLANNING MODEL FOR MOBILE ROBOT NAVIGATION
Academic Session   : 2009/2010

I declare that this thesis is classified as:

CONFIDENTIAL   (Contains confidential information under the Official Secret Act 1972)*
RESTRICTED     (Contains restricted information as specified by the organisation where research was done)*
OPEN ACCESS    I agree that my thesis to be published as online open access (full text)

I acknowledged that Universiti Teknologi Malaysia reserves the right as follows:
1. The thesis is the property of Universiti Teknologi Malaysia.
2. The Library of Universiti Teknologi Malaysia has the right to make copies for the purpose of research only.
3. The Library has the right to make copies of the thesis for academic exchange.

Certified by:

__________________________________            __________________________________
(SIGNATURE)                                   (SIGNATURE OF SUPERVISOR)

870707-23-5137                                En. Mohamad Shukri bin Zainal Abidin
(NEW IC NO. / PASSPORT NO.)                   (NAME OF SUPERVISOR)
Date : APRIL 2010                             Date : APRIL 2010

NOTES : * If the thesis is CONFIDENTIAL or RESTRICTED, please attach the letter from the organisation stating the period and reasons for confidentiality or restriction.

"I hereby declare that I have read this thesis and in my opinion this thesis is sufficient in terms of quality and scope for the purpose of awarding a Bachelor's Degree in Electrical Engineering (Mechatronics)."

Signature : ...........................................................
Name      : En. Mohamad Shukri bin Zainal Abidin
Date      : APRIL 2010

VIRTUAL PATH PLANNING MODEL FOR MOBILE ROBOT NAVIGATION

CHEONG ZEE SWEE

A thesis submitted in partial fulfillment of the requirements for the award of the degree of Bachelor of Electrical Engineering (Mechatronics)

Faculty of Electrical Engineering Universiti Teknologi Malaysia

APRIL 2010


I declare that this thesis entitled "Virtual Path Planning Model for Mobile Robot Navigation" is the result of my own research except as cited in the references. The thesis has not been accepted for any degree and is not concurrently submitted in candidature of any other degree.

Signature : ....................................................
Name      : CHEONG ZEE SWEE
Date      : APRIL 2010


To my dearest family and friends.


ACKNOWLEDGEMENT

I wish to express my appreciation to the people who gave their helping hands while I was preparing this project. Firstly, I wish to thank En. Mohamad Shukri bin Zainal Abidin, my supervisor for this project. His guidance and support sped up the progress of this project.

Next, I would like to thank my family, who have given their greatest encouragement and support for all these years. I would also like to thank my friends who lent a helping hand to help me finish this project.


ABSTRACT

A mobile robot is a robot driven by either a wheel or leg mechanism to carry out tasks in an environment. When the environment changes, the mobile robot usually has to be reprogrammed to adapt to the new environment. The path of the mobile robot has to be rebuilt, and this process is quite time consuming. This project builds a path following model which consists of software and hardware. The software is a GUI which is capable of visualizing the environment condition and doing the path planning, while the hardware is a robot capable of following the corresponding path. The path planning algorithm is based on the concept of extracting check points around each shape and detecting whether those check points are in collision when building the path. The robot calculates its position using the kinematic model from the step input of the stepper motors. The motion of the hardware is controlled by a fuzzy logic controller (FLC) to follow the path. The error over a short travel distance was satisfactory and increases as the motion continues. This project was capable of doing offline path planning, and the robot was able to follow the path.


ABSTRAK

A mobile robot is a robot driven by either a wheel or a leg mechanism. When the environment changes, the mobile robot has to be reprogrammed to adapt to that environment. The path of the mobile robot must be replanned, and this process takes a lot of time. This project aims to produce a model that can follow a planned path. The model contains two parts, namely software and hardware. The software is a graphical user interface capable of visualizing the environment condition and planning a path for the robot, while the hardware is a robot capable of moving along that path. The path planning algorithm is based on the concept of extracting check points around the shapes on the map and then identifying whether those check points will collide with the robot along the path. The robot calculates its position using the kinematic model from the step input data of the stepper motors. The motion of the hardware is controlled by an FLC to follow the planned path. The error over a short journey is satisfactory, and this error grows as the journey continues. This project is capable of planning the robot's path in the computer, and the robot is able to move along the planned path.


TABLE OF CONTENT

CHAPTER   TITLE

          DECLARATION
          DEDICATION
          ACKNOWLEDGEMENT
          ABSTRACT
          ABSTRAK
          TABLE OF CONTENT
          LIST OF TABLES
          LIST OF FIGURES
          LIST OF SYMBOLS AND ABBREVIATIONS
          LIST OF APPENDICES

1         INTRODUCTION
          1.1 Background
          1.2 Project Objective
          1.3 Scope of Study
          1.4 Project Overview
          1.5 Thesis Outline

2         THEORY AND LITERATURE REVIEW
          2.1 Introduction
          2.2 Theory
              2.2.1 Kinematic Model for Differential Drive
              2.2.2 Fuzzy Logic Controller
          2.3 Literature Review
              2.3.1 Simple Path Planning Algorithm for Two-Wheeled Differentially Driven (2WDD) Soccer Robots
              2.3.2 Hybrid Kalman Filter/Fuzzy Logic Based Position Control of Autonomous Mobile Robot
              2.3.3 Collision Free Path Planning Algorithms for Robot Navigation Problem
                    2.3.3.1 Limitation in Genetic Algorithm
                    2.3.3.2 Limitation in Ant Colony Optimization
          2.4 Summary

3         METHODOLOGY
          3.1 Introduction
          3.2 Software GUI Development
              3.2.1 Introduction to WINAPI
                    3.2.1.1 Registering Window
                    3.2.1.2 Creating a Window
                    3.2.1.3 Messages
                    3.2.1.4 Window Procedure
              3.2.2 Introduction to OpenGL
              3.2.3 Database
              3.2.4 C++ Class Development
              3.2.5 Off-line Collision Detection
              3.2.6 Off-line Path Generation Algorithm
              3.2.7 Fuzzy Logic Controller
          3.3 Hardware Implementation
              3.3.1 Microchip dsPIC30F4011
              3.3.2 UCN5804
              3.3.3 Embedded System (PCM-3353)
              3.3.4 Stepper Motor (42BYGHD439-02)
          3.4 Summary

4         RESULT AND DISCUSSION
          4.1 GUI Software
          4.2 Circuit and Hardware
              4.2.1 Main Board Circuit
              4.2.2 Stepper Motor Controller
              4.2.3 Power Supply
              4.2.4 Robot
          4.3 Analysis of Fuzzy Logic Controller
          4.4 Summary

5         CONCLUSIONS
          5.1 Conclusions
          5.2 Recommendations

          REFERENCES
          APPENDIX A
          APPENDIX B


LIST OF TABLES

TABLE     TITLE

3.1       Notations for making fuzzy logic rules
3.2       Rules for left motor speed
3.3       Rules for right motor speed


LIST OF FIGURES

FIGURE    TITLE

1.1       Path planning in a maze
1.2       Description of project overview
1.3       Flow chart of expected outcome of this project
2.1       Schematic robot kinematic models
2.2       Structure of fuzzy logic controller
2.3       Kinematics and local coordinate system
3.1       Flow chart of Window creation algorithm
3.2       Structure of Window program
3.3       Example of a simple window created
3.4       OpenGL rendering pipeline
3.5       Example of OpenGL window
3.6       General data block in the database
3.7       Flow chart of process for database connection in the GUI
3.8       Database table used in the GUI
3.9       Example of data extraction when SQL query was issued
3.10      C++ classes tree developed in the GUI
3.11      A brief description of vector collision detector for circle shape
3.12      A brief description of point collision detector for square shape
3.13      Flow chart of vector collision detector for circle shape
3.14      Flow chart of point collision detector for square shape
3.15      Location of check points for different shape
3.16      Flow chart of path planning algorithm
3.17      Fuzzification for input and output variables
3.18      Pin diagram of dsPIC30F4011
3.19      Pin diagram of UCN5804
3.20      Connection of UCN5804
3.21      Snapshot of PCM-3353 embedded system
3.22      A brief structure description of the stepper motor
4.1       The environment building tab of the GUI
4.2       Example of primitives created in the GUI
4.3       An example of created map by the GUI
4.4       Layout of the path generation tab in GUI
4.5       Path generated in the map created in the GUI
4.6.a     Path planning algorithm creates a path from start point to destination point
4.6.b     Path planning algorithm creates a new path when the original path is being blocked by an obstacle
4.7       Schematic for main board circuit
4.8       Snapshot for the main board circuit
4.9       Schematic for stepper motor controller
4.10      Snapshot for the stepper motor controller circuit
4.11      Schematic for the power supply circuit
4.12      Snapshot for the power supply circuit
4.13      Snapshot for layer of the robot
4.14      Dimension of the robot
4.15      A set of current position data update each time after adjustment by the fuzzy logic controller output
B.1       Location of obstacle tool bar
B.2       Location of window controls in environment building tab
B.3       The window controls in the path planning tab


LIST OF SYMBOLS AND ABBREVIATIONS

SLAM      -  Simultaneous Localization and Mapping
GUI       -  Graphical User Interface
WINAPI    -  Windows Application Programming Interface
OpenGL    -  Open Graphics Library
SBC       -  Single Board Computer
OS        -  Operating System
EKF       -  Extended Kalman Filter
GDI       -  Graphics Device Interface
FLC       -  Fuzzy Logic Controller
GA        -  Genetic Algorithm
ACO       -  Ant Colony Optimization
UART      -  Universal Asynchronous Receiver Transmitter Module
API       -  Application Programming Interface
3D        -  3 Dimensional
2D        -  2 Dimensional
SQL       -  Structured Query Language
CAD       -  Computer Aided Design
CMOS      -  Complementary Metal–Oxide–Semiconductor
VGA       -  Video Graphics Array
DIP       -  Dual In-line Package
FPS       -  Frames Per Second
ODBC      -  Open Database Connectivity
2WDD      -  Two-Wheeled and Differentially Driven


LIST OF APPENDICES

APPENDIX  TITLE

A         Program Source Code
B         Operation Manual
          B.1 Environment Building
          B.2 Path Generation
          B.3 Transferring Path Points

CHAPTER 1

INTRODUCTION

1.1 Background

With the technology available today, many mobile robots or physical agents have to perform various tasks and travel in a known, partially known, or unknown environment. To complete the given tasks, the mobile robots need to follow the path set by the designers. For advanced autonomous mobile robots, the path can be built in computer software and the robots can control their movement to the destination point. These types of robots are usually embedded with many sensors to measure the current position of the robot. Furthermore, many algorithms and theories have been proposed to compensate for the errors that occur in a dynamic environment. Present-day robotics employs a diverse set of sensors, including cameras and ultrasound to measure the environment, and gyroscopes and accelerometers to measure the robot's own motion. Many mobile robots today combine this diverse set of sensors to extract data from the environment and to compensate for the error of each individual sensor.

Mobile robots are defined as robots that move about their environment using wheels, legs, or a similar mechanism. They have been put to use in navigation, delivery, and other robotic fields. For instance, a navigation robot can be put to the task of navigating an abandoned coal mine. The ability of these autonomous navigation robots has made them useful in dangerous situations or in conditions where humans are unable to operate.

Some applications that have applied this technology are rescue robots and the Mars rover. In a dangerous situation, a rescue robot can be used to search for a target and find the shortest path to reach the target through a given pre-defined map. Other types of mobile robots include unmanned air vehicles (UAV), commonly used for surveillance, crop-spraying, and military operations, autonomous underwater vehicles (AUV), used in deep sea exploration, and planetary rovers, such as NASA's Sojourner, a mobile robot that explored the surface of Mars in July 1997.

Some problems arise when the environment travelled by the mobile robot changes. A change in the environment condition would also change the destination point and the path of the robot correspondingly. It would be time consuming to rebuild the paths of the robot every time the environment changes. Therefore, this condition can be resolved by planning the paths of the mobile robot offline to adapt to the dynamic environment. Figure 1.1 shows an example of path planning in a maze.


Figure 1.1 Path planning in a maze.

1.2 Project Objective

By conducting this project, the following objectives need to be achieved:
1. To build a path planning robot model by applying an algorithm for offline collision detection and path planning in a limited size map.
2. To construct path building software to design dynamic maps for the physical robot.
3. To construct a physical robot that follows the path built by the software.


1.3 Scope of Study

This project includes the following scope of work:
1. GUI software is built to visualize different environment conditions.
2. Algorithms to calculate off-line collision detection and path planning are built.
3. A communication platform is built between the software and the robot.
4. The robot is controlled to follow the path built by the software.

1.4 Project Overview

This project focuses on building a path planning model for the mobile robot to plan a path in a known environment. The model consists of a computer GUI and a physical agent, which is a wheel-driven mobile robot. A brief description of the project overview is shown in Figure 1.2.


Figure 1.2 Description of project overview (function of the computer GUI: offline path calculation to plan the path initially; function of the physical agent: follow the path calculated by the software GUI).

As shown in Figure 1.2, the overall project can be divided into two parts: the GUI in the computer and the physical agent, which is a mobile robot. The software GUI focuses on how to visualize the environment condition and on the path planning algorithm, which calculates a path to the destination location while avoiding the obstacles known in the environment. The physical agent consists of an algorithm that controls its motion to follow the path in order to reach the target position. To keep track of the path created, another algorithm that calculates the current position of the mobile robot is needed.

Figure 1.3 shows the flow chart of the expected outcome of this project.


Build up a software GUI that enables users to visualize a known environment → build up the offline collision detection and path planning algorithms → build up the mobile robot → build up an algorithm to control the motion of the mobile robot → build up a communication interface between the software and the mobile robot.

Figure 1.3 Flow Chart of Expected Outcome of This Project.

The first part of the expected outcome is a software GUI that enables the users to build a known environment in the pre-constructed map. Then, the algorithms for offline collision detection and path planning are researched. Both algorithms are built by utilizing the features of the created map. The path points created are x-y coordinate pairs referenced to the origin of the map. After that, a physical agent is built and embedded with the algorithm to measure the current position of the robot and control its motion to follow the path. Finally, a communication interface is built to link the software GUI and the mobile robot. This interface is important as the mobile robot needs to obtain the path points from the GUI.


1.5 Thesis Outline

This thesis contains five chapters, and the content of each chapter is briefly described here. Chapter 1 explains the background of the work and the expected outcome of the project. Chapter 2 explains the theory and the literature review studied throughout the project. Chapter 3 explains the methodology, which is the methods and concepts used to finish the project, while Chapter 4 presents the results, including the software and hardware, and the discussion. Finally, Chapter 5 gives the conclusion and future improvements for the overall project.

CHAPTER 2

THEORY AND LITERATURE REVIEW

2.1 Introduction

In this chapter, the theories used to build up the path planning model are presented together with the literature review, which covers the previous works that have been studied.

2.2 Theory

In this project, there are two main theories used to control the robot to follow the path, which are the kinematic model and the fuzzy logic controller. The kinematic model is used to calculate the current position while the fuzzy logic controller is used to control the robot to follow the path.

2.2.1 Kinematic Model for Differential Drive

The kinematic model is a study of the mathematics of motion without considering the forces that affect the motion. It deals with the geometric relationships that govern the system and the relationship between control parameters and the behaviour of a system in state space. The state vector in the kinematic model is specified as X_k = [x_k y_k θ_k]^T. Figure 2.1 shows the schematic robot kinematic model.

Figure 2.1 Schematic robot kinematic models.

With reference to Figure 2.1, the kinematic model equations for the mobile robot are:

$$\Delta D_k = \frac{\Delta d_k^R + \Delta d_k^L}{2} \qquad (2.1)$$

$$\Delta \theta_k = \frac{\Delta d_k^R - \Delta d_k^L}{b} \qquad (2.2)$$

$$f_x = x_{k+1} = x_k + \Delta D_k \cos\!\left(\theta_k + \frac{\Delta \theta_k}{2}\right) \qquad (2.3)$$

$$f_y = y_{k+1} = y_k + \Delta D_k \sin\!\left(\theta_k + \frac{\Delta \theta_k}{2}\right) \qquad (2.4)$$

$$f_\theta = \theta_{k+1} = \theta_k + \Delta \theta_k \qquad (2.5)$$

where

Δd_k^R  - Linear distance travelled by the right motor.
Δd_k^L  - Linear distance travelled by the left motor.
ΔD_k    - Linear distance travelled by the mobile robot.
b       - Distance between the two wheels.
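For illustration only (this is not the thesis source code), a minimal C++ sketch of the dead-reckoning update in equations (2.1)-(2.5); the function and type names are hypothetical, and in practice the wheel distances would be derived from the stepper motor step counts.

#include <cmath>

// State vector X_k = [x_k, y_k, theta_k]^T of the mobile robot.
struct Pose { double x; double y; double theta; };

// Dead-reckoning update implementing equations (2.1)-(2.5).
// dR, dL : linear distances travelled by the right and left wheels.
// b      : distance between the two wheels.
Pose updatePose(const Pose &p, double dR, double dL, double b)
{
    const double dD     = (dR + dL) / 2.0;   // (2.1) distance travelled by the robot
    const double dTheta = (dR - dL) / b;     // (2.2) change of heading

    Pose next;
    next.x     = p.x + dD * std::cos(p.theta + dTheta / 2.0);   // (2.3)
    next.y     = p.y + dD * std::sin(p.theta + dTheta / 2.0);   // (2.4)
    next.theta = p.theta + dTheta;                              // (2.5)
    return next;
}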

2.2.2 Fuzzy Logic Controller

A fuzzy control system is a control system based on fuzzy logic. Fuzzy logic variables may have a truth value that ranges between 0 and 1 and is not constrained to the two truth values of classic propositional logic.

Sometimes a fuzzy controller is called a fuzzy logic controller (FLC) or even a fuzzy linguistic controller as it uses fuzzy logic in the quantification of linguistic descriptions. Figure 2.2 has shown the structure of a fuzzy logic controller.


Figure 2.2 Structure of fuzzy logic controller.

From the figure above, a fuzzy logic controller consists of four components: fuzzification, the inference mechanism, the rule-base, and defuzzification. Fuzzification is an interface which converts controller inputs into information that the inference mechanism can easily use to activate and apply rules. The inference mechanism (also called an inference engine or fuzzy inference module) imitates the expert's decision making in interpreting and applying knowledge about how best to control the plant. The knowledge of the environment is stored in the rule-base, which fires rules to produce the control output corresponding to an input. Defuzzification is an interface which converts the conclusions of the inference mechanism into a crisp actual output for the process.

To design a fuzzy logic controller, there are five steps to be followed:
1. Define the inputs and outputs for the FLC.
2. Fuzzify the inputs.
3. Set up fuzzy membership functions for the output(s).
4. Create a fuzzy rule base.
5. Defuzzify the outputs.

2.3 Literature Review

2.3.1 Simple Path Planning Algorithm for Two-Wheeled Differentially Driven (2WDD) Soccer Robots (Gregor Novak and Martin Seyr)

In this paper, Gregor Novak and Martin Seyr [1] proposed two algorithms for path planning. A two-wheeled and differentially driven (2WDD) mini robot was used as the test robot. The kinematics of the mobile robot were calculated using the robot's movement, namely the velocity v_R and the angular velocity ω_R, as references. The calculated kinematic values were used to keep track of the position of the robot. Figure 2.3 shows the kinematics and local coordinate system used in the project.

$$v_R = \frac{v_{R,L} + v_{R,R}}{2} \qquad (2.6)$$

$$\omega_R = \frac{v_{R,L} - v_{R,R}}{B} \qquad (2.7)$$

Figure 2.3 Kinematics and local coordinate system.

Depending on whether the target orientation is specified or not, two different algorithms are presented in this paper. The first one calculates a path through a target position without a given orientation; whether the target velocity and the time interval are specified or not does not affect the trajectory in this simple approach. The second algorithm calculates a suitable intersection point to be reached using the first algorithm, and then calculates a circular arc through the target position, so that the target position can be reached with the desired orientation. Since the trajectory is calculated from the target parameters and the present position, it consists of straight lines and circular arcs, and the velocities and angular velocities are piecewise constant.

2.3.2 Hybrid Kalman Filter/Fuzzy Logic Based Position Control of Autonomous Mobile Robot (Rerngwut Choomuang and Nitin Afzulpurkar)

Rerngwut Choomuang and Nitin Afzulpurkar [2] proposed position control of an autonomous mobile robot using a combination of Kalman filter and fuzzy logic techniques. Both techniques have been used to fuse information from internal and external sensors to navigate a typical mobile robot in an unknown environment. The main operation of the Extended Kalman Filter is divided into two parts, the prediction part and the correction part.

The prediction part of the EKF predicts the future state of the system $\hat{x}_k^-$ and projects ahead the state error covariance matrix $P_k^-$ as:

$$\hat{x}_k^- = f(x_{k-1}, u_k; 0, 0) \qquad (2.8)$$

$$P_k^- = A_k P_{k-1} A_k^T + B_k \Gamma B_k^T + Q_{k-1} \qquad (2.9)$$

In the correction part, it performs the measurement update as shown below:

$$K_k = P_k^- H_k^T \left(H_k P_k^- H_k^T + R_k\right)^{-1} \qquad (2.10)$$

$$\hat{x}_k = \hat{x}_k^- + K_k\left(z_k - h(\hat{x}_k^-, 0)\right) \qquad (2.11)$$

$$P_k = (I - K_k H_k) P_k^- \qquad (2.12)$$
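As a compact illustration of equations (2.8)-(2.12) only, a C++ sketch of the predict/correct cycle; it uses the Eigen library (not mentioned in the thesis) and assumes the caller supplies the evaluated models and Jacobians, so it is an assumption rather than the authors' implementation.

#include <Eigen/Dense>

using Eigen::MatrixXd;
using Eigen::VectorXd;

// Prediction step, equations (2.8)-(2.9).
// fEval : process model f(x_{k-1}, u_k; 0, 0); A, B : its Jacobians;
// Gamma, Q : noise covariances; PPrev : previous covariance.
void ekfPredict(const VectorXd &fEval, const MatrixXd &A, const MatrixXd &B,
                const MatrixXd &Gamma, const MatrixXd &Q, const MatrixXd &PPrev,
                VectorXd &xPred, MatrixXd &PPred)
{
    xPred = fEval;                                                        // (2.8)
    PPred = A * PPrev * A.transpose() + B * Gamma * B.transpose() + Q;    // (2.9)
}

// Correction step, equations (2.10)-(2.12); x and P hold the prediction and are updated in place.
// z : measurement, hEval : measurement model h(x^-, 0), H : its Jacobian, R : measurement noise.
void ekfCorrect(const VectorXd &z, const VectorXd &hEval, const MatrixXd &H,
                const MatrixXd &R, VectorXd &x, MatrixXd &P)
{
    const MatrixXd K = P * H.transpose() * (H * P * H.transpose() + R).inverse(); // (2.10)
    x = x + K * (z - hEval);                                                      // (2.11)
    P = (MatrixXd::Identity(P.rows(), P.cols()) - K * H) * P;                     // (2.12)
}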

The objective of using the fuzzy logic control is to generate the velocities for both the left and the right motors of the robot, which allows the robot to move from the start position to the goal position. The control input and output variables have been specified. The inputs are the angle θ_error, which is the difference between the goal angle θ_g and the current heading θ_r of the robot, and the distance d_error, which is the difference between the current position and the goal position; the calculation is a simple computation of the basic equation between two points. The outputs of the fuzzy logic controller are the velocities of the left and right motors of the mobile robot.

An obstacle avoidance algorithm utilizing a stereo vision technique has been implemented for obstacle detection. The odometry errors due to systematic errors (such as unequal wheel diameters or the effect of the encoder resolution) or non-systematic errors (ground plane, wheel slip) contribute to various motion control problems of the robot. As the robot moves, whether in a straight line or an arc, position and orientation errors are created which depend on the systematic or non-systematic odometry errors. The main concern in most navigating systems is to achieve real-time and robust performance to precisely control the robot movements. The objective of this research is to improve the position and the orientation of the robot motion. From the simulation and experiments, it has been shown that the proposed mobile robot moves from the start position to the goal position with greater accuracy while avoiding obstacles.

2.3.3 Collision Free Path Planning Algorithms for Robot Navigation Problem (Kyung Min Han, August 2007)

In this thesis, the results of a detailed investigation of the Genetic Algorithm (GA) and Ant Colony Optimization (ACO) algorithms applied to a path optimization problem are presented. It has been demonstrated that both approaches have great potential as a solution to the proposed problem. In addition, it was discovered that employing elitism led the algorithms to find a solution more easily. However, in a majority of cases the GA found a better solution than the ACO algorithm in terms of performance. Such a result is due to the fact that in the ACO algorithm each ant only proceeds one grid point at a time, while this is unnecessary in the GA case. Despite the good performance of both algorithms, some limitations were found, and overcoming these limitations represents a challenge for future research. The two subsections below describe some of the limitations of the proposed algorithms along with possible future work.

2.3.3.1 Limitation in Genetic Algorithm

In order to guarantee obstacle avoidance, it is necessary to impose another constraint on the algorithm: looking for any intersection point created by the path line and the obstacles can create a significant amount of computation. If only a few obstacles block the optimal path, this computation issue can be ignored. However, when a large number of obstacles exist in the map, the algorithm will require a significant amount of computation time. Although it turned out that the GA is better than the ACO algorithm in terms of finding an optimal path, the GA may pay for that advantage in terms of computation time; this limitation needs to be addressed.

2.3.3.2 Limitation in Ant Colony Optimization

The ACO algorithm has an advantage over the Genetic Algorithm in terms of algorithm execution time. No matter how many obstacles are present, this algorithm does not devote an inordinate amount of time to the iteration process. However, this method is inferior to the GA method in the sense that the ACO approach takes some unnecessary steps, so the algorithm does not return the best solution. Furthermore, a global attraction term had to be added to lead the ant to the goal point. Eliminating this term may not only cause the ant to wander around in the map, but the ant may also become stuck at a point.

2.4 Summary

This chapter presents the theories and literature review carried out to finish this project. In the theory section, the kinematic model for a differential drive mobile robot and the fuzzy logic controller have been discussed. In the literature review section, three main papers were reviewed to study methods for the path planning algorithm.

CHAPTER 3

METHODOLOGY

3.1 Introduction

In this project, the main objective is to develop a path planning model. The model consists of both a hardware and a software part. The hardware part is a physical agent which follows the path calculated by the software GUI; the physical agent is a mobile robot that is capable of keeping track of the path generated by the software.

In order to build a software GUI to visualize the process of path planning for a mobile robot, the GUI software was planned to be built by combining the Win32 API and the low level graphical language, OpenGL. Therefore, the main programming languages utilized in this project are the C and C++ programming languages. In the GUI, there are two algorithms to be developed: one of them is the software collision detection while the other is the path planning algorithm to avoid the obstacles in the map definition.

Between the software GUI and the physical agent, there needs to be a link or communication method to ensure that data can flow in both directions. The GUI is installed on the embedded system while the main controller for the motion of the robot is the dsPIC30F4011 microcontroller. Therefore, UART was used as the means of communication, as both systems contain a UART hardware mechanism.

For the physical agent, a position measurement algorithm has to be embedded. Since the physical agent has to recognize its position and orientation from time to time in order to follow the specified path, the step input data from the stepper motors is integrated to calculate the robot's current position.

3.2 Software GUI Development

The main purpose of the computer GUI is to provide a platform to calculate a path for an agent to a specified destination location. The basic layout of the GUI is drawn using the Win32 API (Windows application programming interface) library. This library provides functions to build up the basic requirements of a GUI, which are a window platform and other controls such as buttons and textboxes, in the C/C++ programming language. To visualize the 3D path, the low level graphical programming language, OpenGL, is used to model the map and the path in a 3D style.

3.2.1 Introduction to Win32 API

The Win32 API is a library consisting of functions for window creation, and all the functions are written in the C or C++ programming language. For a minimal Windows program that just uses the Windows API, two functions must be written. These are a WinMain() function, where execution of the program begins and basic program initialization is carried out, and a WindowProc() function that is called by Windows to process messages for the application. The WinMain function is the entry point to the Windows application. The WinMain function initializes the application, shows the application window on the screen and enters the main loop. The following is the function prototype of the WinMain function:

int WINAPI WinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nCmdShow );

From the function prototype, the first parameter, hInstance, is the instance handle. It is a 32-bit number identifying the instance of our program or application within the OS environment. This number is given by Windows when the program starts executing. The second parameter, hPrevInstance, is always NULL; it is a legacy from 16-bit Windows. Windows programs can still be started from the command line, and the parameters given are stored in lpCmdLine. The nCmdShow value specifies how the window will be displayed, such as minimized, maximized or hidden. The WinMain function terminates when it receives the WM_QUIT message.

3.2.1.1 Registering window

Before creating a window, the window class must be registered with Windows; all windows must be registered. Programming in C with the WINAPI means a lot of working with structures. To register a window, a WNDCLASS structure must be created first and its attributes assigned, such as the window style, extra allocation bytes, window class name, handle of the program instance, background brush, optional menu name, window procedure, and the handles of the cursor and icon. After that, the RegisterClass() function should be called in order to register the window class structure.

3.2.1.2 Creating a window

The window is created by calling the CreateWindow() function.

HWND CreateWindow(LPCTSTR lpClassName, LPCTSTR lpWindowName, DWORD dwStyle,
                  int x, int y, int nWidth, int nHeight, HWND hWndParent,
                  HMENU hMenu, HINSTANCE hInstance, LPVOID lpParam);

The lpClassName uniquely identifies the window; it is the window class name specified in the window class structure. The lpWindowName is a window name; it can be the title of the window for parent windows or a label for child windows like a button or static text. Windows can be created using several styles, and for this we have the dwStyle parameter. The x and y parameters specify the initial horizontal and vertical position of the window. The nWidth and nHeight specify the window width and height. The hWndParent is a handle to the parent window; for windows that do not have parents, NULL is assigned to this handle. For a parent window the hMenu is an optional handle to the menu; for a child window, it is a control identifier. The hInstance is a handle to the program instance. The lpParam is the last parameter; it is an optional value passed to the window during the WM_CREATE message. The CreateWindow() function returns a window handle to the newly created window. Figure 3.1 shows the flow chart for window creation.

Figure 3.1 Flow chart of Window creation algorithm (define the window class, register the window class, then create the window from the registered class; if registration or creation fails, destroy the window and unregister the window class).
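A minimal sketch of the registration and creation steps described above (not taken from the thesis source code in Appendix A); it would sit inside WinMain, and the class name "PathGUIWindow", title and window size are made up for illustration.

WNDCLASS wc = {0};
wc.style         = CS_HREDRAW | CS_VREDRAW;         // window style
wc.lpfnWndProc   = WindowProc;                      // window procedure
wc.hInstance     = hInstance;                       // handle of the program instance
wc.hCursor       = LoadCursor(NULL, IDC_ARROW);     // handle of the cursor
wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);      // background brush
wc.lpszClassName = TEXT("PathGUIWindow");           // window class name

if (!RegisterClass(&wc))                            // register the window class
    return 0;                                       // registration failed

HWND hwnd = CreateWindow(TEXT("PathGUIWindow"),     // registered class name
                         TEXT("Path Planning GUI"), // window title
                         WS_OVERLAPPEDWINDOW,       // dwStyle
                         CW_USEDEFAULT, CW_USEDEFAULT,  // x, y
                         800, 600,                  // nWidth, nHeight
                         NULL, NULL, hInstance, NULL);  // parent, menu, instance, lpParam
if (hwnd == NULL)                                   // creation failed
    return 0;
ShowWindow(hwnd, nCmdShow);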

3.2.1.3 Messages

The WinMain function creates a message loop. It is an endless cycle, which runs during the life of the application. Message loop is a programming construct that waits for and dispatches events or messages in a program. Windows communicate using messages. A message is an integer value that identifies a specific event. It may be a button click, resizing of the window or closing of an application. There can be multiple messages created in a moment. The messages cannot be processed all at the same time. Therefore there is a message queue. The message enters the message queue and waits until it is processed. The PeekMessage() function retrieves the message from the message queue. The DispatchMessage() function dispatches a message to a window procedure. If the application obtains character input, we include the TranslateMessage() function in the loop.
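A sketch of such a message loop using PeekMessage(), as it would typically appear at the end of WinMain; this is standard Win32 usage rather than the thesis' exact code, and the rendering call in the idle branch is only a placeholder.

MSG msg;
bool running = true;
while (running)
{
    // Retrieve any pending message from the message queue without blocking.
    while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
    {
        if (msg.message == WM_QUIT)
            running = false;       // end the loop when WM_QUIT is received
        TranslateMessage(&msg);    // needed when the application obtains character input
        DispatchMessage(&msg);     // pass the message to the window procedure
    }
    // Idle time: e.g. render the OpenGL scene here.
}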

3.2.1.4 Window procedure

LRESULT CALLBACK WindowProc(HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam);

Every window must have a window procedure. It is a function that receives messages. The hwnd is a handle to the window that received the message. The uMsg is the message. The wParam and lParam parameters provide additional message information. The messages come from the user or from the Windows OS. We react to a message, or we call the default window procedure to provide default processing. Most messages are sent to the default window procedure. The default window procedure is called DefWindowProc(), and it is called with the same parameters as the normal window procedure. Figure 3.2 shows the structure of a window application while Figure 3.3 shows an example of a simple window.
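An illustrative window procedure that handles a couple of messages and forwards the rest to DefWindowProc(); the specific messages handled here are examples, not the thesis' actual handler.

LRESULT CALLBACK WindowProc(HWND hwnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
    switch (uMsg)
    {
    case WM_SIZE:              // the window has been resized
        // e.g. resize the OpenGL viewport here
        return 0;
    case WM_DESTROY:           // the window is being destroyed
        PostQuitMessage(0);    // post WM_QUIT so the message loop ends
        return 0;
    }
    // Everything else gets default processing.
    return DefWindowProc(hwnd, uMsg, wParam, lParam);
}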

Figure 3.2 Structure of Window Program.

Figure 3.3 Example of A Simple Window Created.

3.2.2 Introduction to OpenGL

OpenGL provides the programmer with an interface to graphics hardware. It is a powerful, low level rendering and modelling software library, available on all major platforms, with wide hardware support. It is designed for use in any graphics applications, from games to modelling to CAD.

OpenGL is a collection of several hundred functions providing access to all the features offered by your graphics hardware. Internally, it acts as a state machine: a collection of states that tell OpenGL what to do. Using the API, you can set various aspects of the state machine, including such things as the current colour, lighting, blending, and so on. When rendering, everything drawn is affected by the current settings of the state machine. It is important to be aware of what the various states are, and the effect they have, because it is not uncommon to have unexpected results due to having one or more states set incorrectly. At the core of OpenGL is the rendering pipeline, as shown in Figure 3.4.

Figure 3.4 OpenGL rendering pipeline.

Under Windows, OpenGL provides an alternative to using the Graphics Device Interface (GDI). GDI architects designed it to make the graphics hardware entirely invisible to Windows programmers. This provides layers of abstraction that help programmers avoid dealing with device-specific issues. OpenGL allows the user to bypass GDI entirely and deal directly with the graphics hardware. Figure 3.5 shows an example of an OpenGL window.


Figure 3.5 Example of OpenGL Window
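For illustration, a short fragment of legacy fixed-function OpenGL showing the state-machine style described above: state is set with calls such as glClearColor() and glColor3f(), and everything drawn afterwards is affected by that state. This is a generic example, not the thesis' rendering code; hDC is an assumed Win32 device context.

// Set state: clear colour and the current drawing colour.
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);    // background colour (black)
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glColor3f(1.0f, 0.0f, 0.0f);             // current colour state: red
glBegin(GL_TRIANGLES);                   // everything below is drawn with the current state
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glVertex3f( 0.0f,  1.0f, 0.0f);
glEnd();

SwapBuffers(hDC);                        // present the rendered frame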

3.2.3 Database

A database is a collection of data and was used to store the data related to this project, such as the maps created and the Cartesian coordinates of the paths generated in the software, by using the ODBC library connected to the Sybase driver. Data manipulation in the database is based on SQL (Structured Query Language) commands. Structured Query Language is the standard language used to communicate with a relational database. Figure 3.6 shows the general data block in the database.

Figure 3.6 General data block in the database.

There are some common commands used in the project to save data into the corresponding tables, which are INSERT, DELETE and COMMIT. The INSERT command is used to insert a new row of data into a table while the DELETE command is used to delete a row of data that exists in the table. Every table update procedure must be followed by a COMMIT transaction command to save the updated data in the table.

To connect to the database from the software GUI, there are two prerequisites: the database connection driver and the C programming database API. The database connection driver used for the database storage is the Sybase driver while the database API is the ODBC API. The flow chart of the database connection process is shown in Figure 3.7.

Figure 3.7 Flow chart of process for database connection in the GUI (allocate memory for the database environment handle, set the attribute for the database environment, allocate memory for the database connection handle, connect to the database driver, allocate memory for the database statement handle, then use the database statement handle to execute SQL commands).
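A minimal ODBC sketch following the steps of Figure 3.7; the data source name "PathDB", the credentials and the SQL statement are hypothetical, and error checking is omitted for brevity.

#include <windows.h>
#include <sql.h>
#include <sqlext.h>

SQLHENV  env  = SQL_NULL_HENV;
SQLHDBC  dbc  = SQL_NULL_HDBC;
SQLHSTMT stmt = SQL_NULL_HSTMT;

// 1. Allocate the database environment handle and set its attribute.
SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);

// 2. Allocate the connection handle and connect to the driver (Sybase DSN assumed).
SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
SQLConnect(dbc, (SQLCHAR*)"PathDB", SQL_NTS,
           (SQLCHAR*)"user", SQL_NTS, (SQLCHAR*)"password", SQL_NTS);

// 3. Allocate a statement handle and use it to execute SQL commands.
SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
SQLExecDirect(stmt, (SQLCHAR*)"INSERT INTO Path (MapID, PathID) VALUES (1, 1)", SQL_NTS);
SQLExecDirect(stmt, (SQLCHAR*)"COMMIT", SQL_NTS);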

In the database, there are three main tables used to manipulate the data from the GUI. Figure 3.8 shows the database tables used in the GUI. The first table is Map, which contains the information needed to load a map into the GUI. The second table is Path, which contains the path point coordinate information generated for a map. The last table is Selected_Path, which records which path will be extracted when there is a request from the mobile robot. Figure 3.9 shows an example of data extraction when an SQL query was issued.

Map: MapID, Primitives, Origin, Orientation, Attributes
Path: MapID, PathID, Number of points, Points Coordinate
SELECTED_PATH: MapID, PathID

Figure 3.8 Database table used in the GUI.

Figure 3.9 Example of data extraction when SQL query was issued.

3.2.4 C++ Class Development

In the software architecture, there are two main classes to construct the whole structure of software: CWindow class and Primitives class.

CWindow is the base class to be inherited by the other window object classes, and all the attributes in the CWindow base class can be accessed by its child classes. The child classes that inherit the CWindow base class are WindowForm, GLWindow and Control. To build up the window control objects, the Control class can be further derived to create each window control class. PathGUI, which inherits the WindowForm class, is the GUI class for this project; it contains all the variables needed to build up the software.

The Primitives class is another base class, created to hold the common attributes and behaviours of each shape class. Common attributes such as the origin point and orientation of a shape can be extracted through the functions of this base class. The shape classes that inherit this base class are CLine, CCurve, CQuadric, CPlane, and CCube. All the shapes are drawn using the OpenGL API.

To manipulate the primitives in a 3D space, the Vector namespace was created to manipulate the behaviours of the primitives. The Vector namespace includes vector functions in a 3D space, such as operations between two vectors and operations between a vector and a point. Figure 3.10 shows the C++ class tree that has been developed in the GUI.

Figure 3.10 C++ classes tree developed in the GUI (base classes CWindow and Primitives; GLWindow, WindowForm and Control inherit CWindow; the window controls ComboBox, Button, ListBox, ToolBar, StatusBar, StaticLabel, TextBox and TabBox inherit Control; the path planning GUI class PathGUI inherits WindowForm; CCube, CQuadric, CLine, CPlane and CCurve inherit Primitives, with CCylinder, CDisk and CSphere under CQuadric; the Vector namespace supports the primitive classes).
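As an illustrative header-style sketch of the hierarchy in Figure 3.10: only the class and namespace names come from the thesis, while the members shown are invented placeholders (and <windows.h> is assumed for HWND).

class CWindow {
public:
    virtual ~CWindow() {}
    HWND handle() const { return hwnd; }          // every window object wraps a window handle
protected:
    HWND hwnd = NULL;
};

class WindowForm : public CWindow {};             // top-level form window
class GLWindow   : public CWindow {};             // OpenGL rendering window
class Control    : public CWindow {};             // base of Button, ListBox, TextBox, ...
class PathGUI    : public WindowForm {};          // the path planning GUI itself

// Base class holding the common attributes of every shape.
class Primitives {
public:
    virtual ~Primitives() {}
    virtual void draw() const = 0;                // each shape draws itself with the OpenGL API
protected:
    double origin[3] = {0.0, 0.0, 0.0};           // origin point of the shape
    double orientation = 0.0;                     // orientation of the shape
};

class CCube : public Primitives { public: void draw() const override { /* ... */ } };
class CLine : public Primitives { public: void draw() const override { /* ... */ } };

namespace Vector {
    // 3-D vector operations used to manipulate the primitives.
    double dot(const double a[3], const double b[3]);
}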

3.2.5 Off-line Collision Detection

The algorithm used for the software collision detection is a combination of a point detector and a vector detector. Primitives (basic shape objects such as rectangles, cylinders, cubes, and spheres) are projected into a 2-D scheme and then divided into two categories: square shapes and circle shapes. To recognize a collision with a circle shape, a vector detector is used. Figure 3.11 shows a brief description of the vector collision detector for the circle shape.

Figure 3.11 A brief description of vector collision detector for circle shape.

A  -  Start point
B  -  End point
C  -  Origin point of the obstacle
a  -  Vector from point A to point B
b  -  Vector from point A to point C
d  -  Orthogonal distance from vector a to vector b
θ  -  Angle between vector a and vector b

The shortest distance d is tested to determine whether a collision will occur: if d is less than the radius of the circle, a collision occurs, while if d is greater than the radius of the circle, there is no collision. The formulas below present the calculations:

$$\mathbf{a} = B - A = (X_B, Y_B) - (X_A, Y_A) = (X_B - X_A,\; Y_B - Y_A)$$

$$\mathbf{b} = C - A = (X_C, Y_C) - (X_A, Y_A) = (X_C - X_A,\; Y_C - Y_A)$$

$$\cos\theta = \frac{\mathbf{a} \cdot \mathbf{b}}{|\mathbf{a}||\mathbf{b}|}, \qquad \theta = \cos^{-1}\!\left(\frac{\mathbf{a} \cdot \mathbf{b}}{|\mathbf{a}||\mathbf{b}|}\right)$$

$$\sin\theta = \frac{d}{|\mathbf{b}|}, \qquad d = |\mathbf{b}|\sin\theta$$
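A small C++ sketch of the vector detector above (illustrative only; the names are hypothetical). It follows the thesis formulas directly: it computes the angle between vectors a and b and from it the orthogonal distance d, then compares d with the obstacle radius.

#include <cmath>

struct Point2D { double x, y; };

// Vector collision detector for a circle-shaped obstacle (Figures 3.11 and 3.13).
// A : start point, B : end point, C : obstacle origin, radius : obstacle radius.
bool circleCollision(const Point2D &A, const Point2D &B, const Point2D &C, double radius)
{
    const double ax = B.x - A.x, ay = B.y - A.y;              // vector a = B - A
    const double bx = C.x - A.x, by = C.y - A.y;              // vector b = C - A

    const double cosTheta = (ax * bx + ay * by) /
                            (std::hypot(ax, ay) * std::hypot(bx, by));
    const double theta = std::acos(cosTheta);                  // angle between a and b
    const double d = std::hypot(bx, by) * std::sin(theta);     // orthogonal distance d

    return d < radius;                                         // collision if d is inside the circle
}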

To determine a collision with a square shape, the vector from the start point to the end point is divided into many fragment points. Each fragment point at step k is tested to see whether it is inside the boundary of the rectangular shape. Figure 3.12 shows a brief description of the point collision detector for the square shape.

Figure 3.12 A brief description of point collision detector for square shape (a fragment point P_k tested against the width and height of the rectangle).

When a point P_k is found inside the width and height boundary of the rectangular shape, the vector from the start point to the end point experiences a collision with the rectangular obstacle. Figures 3.13 and 3.14 show the flow charts of the two collision detector algorithms.

Figure 3.13 Flow chart of vector collision detector for circle shape (calculate the vector from the start point to the target point and the vector from the start point to the obstacle origin, calculate the shortest distance between the two vectors, then return true (collision occurs) if the radius of the obstacle is greater than the shortest distance, otherwise return false).

Figure 3.14 Flow chart of point collision detector for square shape (calculate the vector from the start point to the target point and divide it into k small fragment points; if the i-th fragment point lies inside the boundary of the square shape, return true (collision occurs), otherwise increment i and repeat until i equals k, then return false).
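A matching sketch of the point detector in Figures 3.12 and 3.14 (illustrative only, reusing Point2D from the previous sketch and assuming an axis-aligned rectangle described by its centre, width and height): the segment is split into k fragment points and each is tested against the rectangle boundary.

struct Rect2D { double cx, cy, width, height; };   // centre and size of the obstacle

// Point collision detector for a square/rectangular obstacle.
// The vector from the start point A to the target point B is divided into k fragment points.
bool squareCollision(const Point2D &A, const Point2D &B, const Rect2D &r, int k = 100)
{
    for (int i = 0; i <= k; ++i) {
        const double t  = static_cast<double>(i) / k;
        const double px = A.x + t * (B.x - A.x);       // i-th fragment point
        const double py = A.y + t * (B.y - A.y);

        const bool insideX = std::fabs(px - r.cx) <= r.width  / 2.0;
        const bool insideY = std::fabs(py - r.cy) <= r.height / 2.0;
        if (insideX && insideY)
            return true;                               // fragment point is inside the boundary
    }
    return false;                                      // no fragment point collided
}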

3.2.6 Off-line Path Generation Algorithm

The primitives such as planes, cubes, cylinders, disks, and spheres which were generated in the OpenGL window are projected into 2D forms to generate the path automatically. To ease the calculation, all check points are extracted from the primitives before the path is generated. The check points of a primitive are points placed around the shape. Since all primitives are projected into 2D forms, they can be categorized into two shapes, square and circle. The check points used in the algorithm for these particular shapes are shown in the figure below:


Figure 3.15 Location of check points for different shape.

The marks around each shape are the check points corresponding to the specified shape. To avoid collision, the distance from the check points to the shape must be greater than half the width of the physical agent.

The path control points are built from the check points extracted from the primitives in the map. Figure 3.16 shows the flow chart of the path planning algorithm.

Figure 3.16 Flow chart of path planning algorithm (initialize the path control points and set the target point as the first control point; while a collision is detected between the i-th control point and the start point, step through the check points j, skipping any check point that equals the i-th control point or collides with it, and keep in a buffer the candidate check point nearest to the start point; when j equals the number of check points, increment i and save the buffer as the i-th control point; the algorithm ends when the latest control point can reach the start point without collision).
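A rough C++ sketch of the control-point loop as read from Figure 3.16 (an interpretation, not the thesis source code): starting from the target point, it repeatedly picks the collision-free check point closest to the start point until the start point can be reached directly. The helper collides() stands for the detectors of Section 3.2.5 applied against every obstacle in the map, and Point2D is reused from the earlier sketches.

#include <vector>
#include <cmath>

bool collides(const Point2D &a, const Point2D &b);   // assumed: true if segment a-b hits any obstacle

static double dist(const Point2D &a, const Point2D &b)
{
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Returns the control points from the target point back towards the start point.
std::vector<Point2D> planPath(const Point2D &start, const Point2D &target,
                              const std::vector<Point2D> &checkPoints)
{
    std::vector<Point2D> control{target};             // target is the first control point

    // While the latest control point cannot see the start point directly...
    while (collides(control.back(), start)) {
        bool found = false;
        Point2D best{};
        for (const Point2D &cp : checkPoints) {
            bool used = false;                         // skip check points already chosen
            for (const Point2D &c : control)
                if (c.x == cp.x && c.y == cp.y) { used = true; break; }
            if (used || collides(cp, control.back()))  // skip blocked check points
                continue;
            if (!found || dist(cp, start) < dist(best, start)) {
                best = cp;                             // buffer the candidate nearest to the start
                found = true;
            }
        }
        if (!found)
            break;                                     // no reachable check point: give up
        control.push_back(best);                       // save the buffer as the next control point
    }
    return control;
}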

3.2.7 Fuzzy Logic Controller

The fuzzy logic controller is designed to control the speed of the motors of the robot to follow the path. There are two input variables and two output variables for the controller. The input variables are the distance error and the angle error to a destination point, while the output variables are the speeds of the left motor and the right motor respectively. Table 3.1 shows the notations used to make the fuzzy logic rules while Figure 3.17 shows the fuzzification of the input and output variables.

Table 3.1 Notations for making fuzzy logic rules.

Distance error:          Angle error:               Motor speed:
VN  Very Near            NL  Negative Large         ST  Stop
LN  Little Near          NS  Negative Small         SF  Slow Forward
ME  Middle               ZE  Zero                   F   Forward
LF  Little Far           PS  Positive Small         FF  Fast Forward
VF  Very Far             PL  Positive Large         SB  Slow Backward
                                                    B   Backward
                                                    FB  Fast Backward

Figure 3.17 Fuzzification for input and output variables (fuzzy membership functions of the input distance error, the input angle error, the output left motor speed and the output right motor speed).

The next step in designing a fuzzy logic controller is to create a rule-base based on knowledge of the environment. The rule-base is created to control the plant system, which here means controlling the mobile robot to track a path. There are two rule tables, one for the left motor speed and another for the right motor speed. Tables 3.2 and 3.3 show the rule-base designed for the mobile robot.

Table 3.2 Rules for left motor speed.

                            Angle Error
Distance Error     NL     NS     ZE     PS     PL
VN                 ST     ST     ST     ST     ST
LN                 FF     SF     SF     ST     ST
ME                 FF     F      F      SF     SF
LF                 FF     FF     FF     SF     SF
VF                 FF     FF     FF     SF     SF

Table 3.3 Rules for right motor speed.

                            Angle Error
Distance Error     NL     NS     ZE     PS     PL
VN                 ST     ST     ST     ST     ST
LN                 ST     ST     SB     SB     SB
ME                 SB     SB     B      B      FB
LF                 SB     SB     FB     FB     FB
VF                 SB     SB     FB     FB     FB

In this project, the defuzzification method used is the weighted average method. This method is valid only for symmetrical output membership functions. The weighted average is formed by weighting each output membership function by its respective maximum membership value. Below is the algebraic expression for the defuzzification method:

$$Z^* = \frac{\sum u_c(\bar{z})\,\bar{z}}{\sum u_c(\bar{z})}$$

where

u_c(z̄)  - maximum membership value.
z̄       - output value for the maximum membership value.
Z*      - crisp set value for the actual output value.
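For illustration, a condensed C++ sketch of the controller structure described in this section: triangular fuzzification of the two inputs, a 5 x 5 rule table like Tables 3.2 and 3.3, and weighted-average defuzzification. The membership break-points and the crisp speed values in ruleOut are made-up placeholders, not the thesis' tuned values.

#include <algorithm>
#include <array>

// Triangular membership function with peak at b over the interval [a, c].
static double tri(double x, double a, double b, double c)
{
    if (x <= a || x >= c) return 0.0;
    return (x < b) ? (x - a) / (b - a) : (c - x) / (c - b);
}

// Index order: distance terms VN, LN, ME, LF, VF and angle terms NL, NS, ZE, PS, PL.
// ruleOut[i][j] holds the crisp speed for the output term fired by rule (i, j).
double flcMotorSpeed(double distErr, double angErr,
                     const std::array<std::array<double, 5>, 5> &ruleOut)
{
    // Fuzzification of the inputs (placeholder break-points; cm and degrees assumed).
    const double dMu[5] = { tri(distErr, -10, 0, 10), tri(distErr, 0, 10, 20),
                            tri(distErr, 10, 20, 30), tri(distErr, 20, 30, 40),
                            tri(distErr, 30, 40, 50) };
    const double aMu[5] = { tri(angErr, -180, -90, -30), tri(angErr, -90, -30, 0),
                            tri(angErr, -30, 0, 30),     tri(angErr, 0, 30, 90),
                            tri(angErr, 30, 90, 180) };

    // Inference and weighted-average defuzzification: Z* = sum(u * z) / sum(u).
    double num = 0.0, den = 0.0;
    for (int i = 0; i < 5; ++i)
        for (int j = 0; j < 5; ++j) {
            const double u = std::min(dMu[i], aMu[j]);   // rule firing strength
            num += u * ruleOut[i][j];
            den += u;
        }
    return (den > 0.0) ? num / den : 0.0;                // crisp motor speed
}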

3.3 Hardware Implementation

In this project, several hardware devices are applied to build up the mobile robot, such as an embedded system, a microcontroller and stepper motors. The embedded system acts as a computer system to hold the GUI platform while the microcontroller controls the motion of the mobile robot.

3.3.1 Microchip dsPIC30F4011

Figure 3.18 shows the pin diagram of the dsPIC30F4011.


Figure 3.18 Pin diagram of dsPIC30F4011.

The main processor for the stepper motor control and the position measurement of the physical robot is the Microchip dsPIC30F4011. The features of this processor that have been used include the timer/counter and the UART. The timer/counter is used to set the period of the step input provided to the UCN5804 motor driver, while the UART is utilized as a communication interface with the embedded system.

This microchip contains the program of the fuzzy logic controller in the C programming language. To write the code into the microchip, development tools such as MPLAB and PICKIT2 were used: MPLAB was used to develop the source code to control the mobile robot while PICKIT2 was used to write the code into the microchip.
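A hypothetical C fragment (assuming the MPLAB C30 toolchain and that the UCN5804 step input is wired to RB0) showing how a Timer 1 interrupt could generate the step period mentioned above; the register names are standard dsPIC30F names, but the pin assignment and period value are illustrative only, not the thesis' actual firmware.

#include <p30f4011.h>

/* Timer 1 interrupt: toggle the step output each period, so PR1 sets the stepping rate. */
void __attribute__((__interrupt__, no_auto_psv)) _T1Interrupt(void)
{
    LATBbits.LATB0 = !LATBbits.LATB0;   /* one step edge to the UCN5804 (assumed on RB0) */
    IFS0bits.T1IF = 0;                  /* clear the Timer 1 interrupt flag */
}

void initStepTimer(void)
{
    TRISBbits.TRISB0 = 0;               /* RB0 as output for the step signal */
    T1CON = 0;                          /* reset Timer 1 configuration */
    T1CONbits.TCKPS = 1;                /* prescaler 1:8 */
    PR1 = 5000;                         /* period register: illustrative value */
    TMR1 = 0;
    IFS0bits.T1IF = 0;
    IEC0bits.T1IE = 1;                  /* enable the Timer 1 interrupt */
    T1CONbits.TON = 1;                  /* start Timer 1 */
}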

3.3.2 UCN5804

The UCN5804 is a unipolar stepper motor translator/driver. Figure 3.19 shows the pin diagram of the UCN5804.

Figure 3.19 Pin diagram of UCN5804.

The UCN5804 combines low-power CMOS logic with high-current and high-voltage bipolar outputs; the driver provides complete control and drive for a four-phase unipolar stepper motor, with continuous output current ratings up to 1.25 A per phase (1.5 A start-up) and 35 V.

Three stepper-motor drive formats are available: wave-drive (one-phase), two-phase, and half-step. The wave-drive format consists of energizing one motor phase at a time in an A-B-C-D (or D-C-B-A) sequence. This excitation mode consumes the least power and assures positional accuracy regardless of any winding imbalance in the motor. Two-phase drive energizes two adjacent phases in each detent position (AB-BC-CD-DA). This sequence mode offers an improved torque-speed product, greater detent torque, and is less susceptible to motor resonance. The half-step drive format alternates its excitation between the one-phase and two-phase modes (A-AB-B-BC-C-CD-D-DA), providing an eight-step sequence. Figure 3.20 shows the connection of the UCN5804.

Figure 3.20 Connection of UCN5804.

3.3.3 Embedded System (PCM-3353)

Figure 3.21 shows a snapshot of the PCM-3353 embedded system.


Figure 3.21 Snapshot of PCM-3353 embedded system.

The embedded system is used to hold the GUI software on a Windows XP operating system (OS) on the physical agent. It is equipped with an 8 GB compact flash memory card, which is enough to hold an XP platform as the main operating system. The calculation of a desired path is done by the user by connecting the PCM-3353 to a VGA display and other input devices. This small computer processing unit then saves the x-y path coordinate points into the database and extracts them when the microcontroller sends a request signal to the embedded system. The embedded system has four COM ports, which were used to communicate with the microcontroller through the UART mechanism.

3.3.4 Stepper Motor (42BYGHD439-02)

Figure 3.22 shows a brief structural description of the stepper motor.


Figure 3.22 A brief structure description of the stepper motor.

This stepper motor model requires 0.8 A to operate normally and has six lead wires. The holding torque for this model is 2.5 kg·cm. The six lead wires are connected to the UCN5804, the stepper motor driver, to control the speed and drive format of the stepper motor.

3.4 Summary

This chapter presents the methodology for the project, which is divided into two main parts: the software part and the hardware part. The software part covers the programming languages and components needed to build up the software GUI and the algorithms to control the mobile robot; the components include the collision detection and path calculation algorithms and the database implementation. The hardware part consists of the integrated circuits and the motors used to build up the mobile robot.

CHAPTER 4

RESULT AND DISCUSSION

4.1 GUI Software

The GUI software created in this project has the functions of visualizing the real environment condition and doing the offline path planning. To provide these two main functions, the GUI software is divided into two tabs, which are the environment building tab and the path generation tab. Figure 4.1 shows the environment building tab.

Figure 4.1 The environment building tab of the GUI (showing the performance measurement for the OpenGL window and the environment building window controls).

In the environment building tab, a menu and textboxes are provided in order to create the primitives (basic shapes created in the OpenGL window) in the pre-constructed map. There are a number of primitives that the GUI is capable of constructing, and Figure 4.2 shows examples of those primitives.

Figure 4.2 Example of primitives created in the GUI (plane, cube, cylinder, disk and sphere).

As shown in Figure 4.2, the primitives which can be generated by the software GUI are the plane, cube, cylinder, disk and sphere. The primitives are generated based on the attributes set by the user through the window controls in the environment building tab. From these primitives, a number of maps can be created in the GUI software. Figure 4.3 shows an example of a created map.

Figure 4.3 An example of created map by the GUI.

There is a performance measurement in the top right corner of the OpenGL window. It measures how fast the OpenGL window can be refreshed, and the unit of measurement is frames per second (FPS).

The second tab in the software GUI is the path planning tab, which consists of window controls that enable users to set the start point and destination point and calculate the path from the start point to the destination point. Figure 4.4 shows the layout of the path generation tab of the GUI.

Figure 4.4 Layout of the path generation tab in GUI (purple: start point; yellow: target point; controls to set the start point and target point and to generate and save the path to be followed by the robot; map origin).

In the path generation tab, the user is able to set the start point and destination point in the map in order to calculate a path from the start point to the destination point. All point coordinates are x-y coordinate pairs referenced to the map origin located in the centre of the map. As shown in Figure 4.4, the start point set is shown in purple while the destination point is shown in yellow. After inserting the start point and destination point, a button located at the bottom left of the GUI initializes the path calculation function. Then, a path is generated from the start point to the target point. Figure 4.5 shows the path generated in the map created in the GUI.


Figure 4.5 Path generated in the map created in the GUI.

There are some notable features of the path planning algorithm developed. A path that avoids the obstacles in the map can be found by the path planning algorithm. Furthermore, the original path generated is replaced by a new path when the original path is blocked by another obstacle. All the paths are built as x-y coordinate pairs and then saved in the database. When the mobile robot requests the path points, the GUI software extracts the path points from the database and the data is transferred to the mobile robot through the UART communication interface. Figures 4.6a and 4.6b show these features of the path planning algorithm.


Figure 4.6a Path planning algorithm creates a path from start point to destination point.

Figure 4.6b Path planning algorithm creates a new path when the original path is being blocked by an obstacle.

4.2 Circuit and Hardware

4.2.1 Main Board Circuit

Figure 4.7 shows the schematic for the main board circuit. The main board is the main controller for the mobile robot. It contains the Microchip dsPIC30F4011 as the main microcontroller that holds the program of the system. The microchip is fed with a 10 MHz crystal on its oscillator pins. Furthermore, the main board outputs the step signal to the stepper motor controller. Figure 4.8 shows the real product of the main board.


Figure 4.7 Schematic for main board circuit.

Figure 4.8 Snapshot for the main board circuit.
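The step output mentioned above is generated in firmware by toggling the step pin from a periodic timer interrupt, where the timer period sets the motor speed. The sketch below condenses the idea behind the timer-4 routine listed in Appendix A; the variable names here are simplified placeholders:

/* Condensed from the timer-4 interrupt in Appendix A; register and pin
   names are simplified placeholders. */
static volatile unsigned char stepPin = 0;   /* stands in for the LPULSE latch bit   */
static volatile unsigned char forward = 1;   /* stands in for the direction flag     */
static volatile double wheelAngle = 0.0;     /* accumulated wheel angle in degrees   */

void StepTimerISR(void)                      /* called once per timer period         */
{
    static unsigned char phase = 0;

    stepPin ^= 1;                            /* one edge of the step pulse           */
    phase = (phase + 1) % 2;
    if (phase == 0)                          /* a complete pulse = one full step     */
        wheelAngle += forward ? 1.8 : -1.8;  /* 1.8 degrees per full step            */

    /* The motor speed is changed by writing a new period into the timer
       period register (PR4 in the actual firmware). */
}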

4.2.2 Stepper Motor Controller

Figure 4.9 shows the schematic of the stepper motor controller. The stepper motor controller uses a UCN5804 driver to control the stepper motor speed. The UCN5804 offers three operation modes for driving the stepper motor: wave drive (one-phase), two-phase, and half-step. A DIP switch is used to select among these three operation modes. Figure 4.10 shows the fabricated stepper motor controller.

Figure 4.9 Schematic for stepper motor controller.

Figure 4.10 Snapshot for the stepper motor controller circuit.

4.2.3 Power Supply

Figure 4.11 shows the schematic of the power supply. The schematic is divided into two parts: the power supply for the embedded system and the power supply for the main board. The power supply for the embedded system is slightly different because of its higher current requirement; an extra circuit is added to raise the current limit that can pass through the regulator. Figure 4.12 shows the fabricated power supply circuit.

Figure 4.11 Schematic for the power supply circuit.


Figure 4.12 Snapshot for the power supply circuit.

4.2.4 Robot

The mobile robot consists of three layers, and each layer holds a different circuit. Figure 4.13 shows a snapshot of each layer of the robot. The first layer holds the main board and the stepper motor controller circuit, the second layer holds the embedded system, and the third layer holds the power supply and battery.

Figure 4.13 Snapshot of the first, second and third layers of the robot.

The dimensions of the robot are 23 x 20 x 16 cm: the width of the robot is 23cm, the length is 20cm and the height is 16cm. Figure 4.14 shows the dimensions of the robot.

Figure 4.14 Dimensions of the robot.

4.3 Analysis of Fuzzy Logic Controller

The fuzzy logic controller was designed to control the motion of the mobile robot so that it tracks the path created by the GUI software. Figure 4.15 shows a set of current position data updated after each adjustment by the fuzzy logic controller output.


Figure 4.15 A set of current position data updated after each adjustment by the fuzzy logic controller output.

In Figure 4.15, the current position data measured by the robot using the kinematic model was obtained on the computer through the UART communication interface. With reference to the state variable (x, y, angle) of the kinematic model, the start point was set to (-1, 0, 43°) and the destination point was set to (40, 40, 45°). The mobile robot finally stopped at (35, 40, 45°).
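The current position plotted in Figure 4.15 is estimated with the differential-drive odometry implemented in the main board firmware (the Odometry() routine in Appendix A). A condensed version of that update is sketched below; it restates the appendix code rather than adding new functionality:

#include <math.h>

#define ROBOT_LENGHT 23.2f              /* wheel base in cm, as defined in Appendix A */

static double x = 0.0, y = 0.0, angle = 0.0;    /* robot state (x, y, angle) */

/* dLeft/dRight: distance travelled by each wheel since the last update, in cm. */
void OdometryUpdate(double dLeft, double dRight)
{
    double dDistance = (dLeft + dRight) / 2.0;          /* translation of robot centre */
    double dAngle    = (dRight - dLeft) / ROBOT_LENGHT; /* change of heading (radian)  */

    x += dDistance * cos(angle + dAngle / 2.0);
    y += dDistance * sin(angle + dAngle / 2.0);
    angle += dAngle;
}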

The accuracy of the fuzzy logic controller was computed as follows:

Target Distance = √(40² + 40²) = 56.5685

Travel Distance = √(35² + 40²) = 53.1507

Percentage of Error = (56.5685 - 53.1507) / 53.5685 x 100% = 6.38%

The percentage of error for the fuzzy logic controller was about 6.38%, referenced to the current position data measured by the mobile robot. The real position error would be larger due to the incremental error of the step input of the stepper motor.
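For reference, the same accuracy check can be reproduced directly from the logged coordinates; the short program below is an illustrative calculation only and prints an error of roughly 6.4%, consistent with the figure reported above:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Target and measured stop coordinates taken from Figure 4.15 (in cm). */
    double targetDistance = sqrt(40.0 * 40.0 + 40.0 * 40.0);   /* 56.5685 */
    double travelDistance = sqrt(35.0 * 35.0 + 40.0 * 40.0);   /* 53.1507 */

    /* Relative error of the travelled distance. */
    double errorPercent = (targetDistance - travelDistance) / travelDistance * 100.0;
    printf("error = %.2f%%\n", errorPercent);
    return 0;
}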

4.4 Summary

This chapter presents the results and discussion for the overall project. The results consist of two parts: the software GUI and the hardware. The software GUI part describes the GUI created, which is capable of visualizing the environment and performing the path planning. The hardware part covers the resulting circuits and the dimensions of the mobile robot. The discussion focuses on the analysis of the accuracy of the fuzzy logic controller in driving the mobile robot to the target position.

CHAPTER 5

CONCLUSIONS

5.1 Conclusions

In conclusion, the developed software GUI is capable of visualizing the real environment with primitives created using the OpenGL graphics library. The types of primitives that can be generated are the line, curve, cube, cylinder, disk and sphere. With these primitives, the real environment can be represented in the fixed virtual map in the software GUI. The software GUI also has the feature of saving and loading a previously generated map.

Then, a path can be generated based on the start point and target point set by the user. The path points are calculated by applying the collision detection and path calculation algorithms developed. Both algorithms work together to generate a set of path control points, and these control points can be transferred to the mobile robot. After the path points are transferred, the mobile robot is able to move along the path planned in the software GUI. The final stop position of the mobile robot will differ slightly from the position set by the user in the software GUI because of the incremental error of the step input from the stepper motor and environmental error.

5.2 Recommendations

For further improvement, the path calculation algorithm can be focused on searching for the shortest distance to the destination point, which will reduce the time needed to reach it. The software GUI can also be made more user-friendly by adding features such as allowing the user to adjust the map size, modifying the position of the primitives by dragging, and others.

The physical agent can also be equipped with other position sensors to measure the current position of the robot more accurately. With more advanced position sensors, the mobile robot can reach the destination point more accurately in real-time applications. To reduce input signal noise while measuring the current position of the mobile robot, a filter algorithm such as the Kalman filter or particle filter can be added to the mobile robot's program.


REFERENCES

1. Gregor Novak, Martin Seyr. Simple Path Planning Algorithm for Two-Wheeled Differentially Driven (2WDD) Soccer Robots. Vienna University of Technology.
2. Rerngwut Choomuang, Nitin Afzulpurkar. Hybrid Kalman Filter/Fuzzy Logic based Position Control of Autonomous Mobile Robot. Asian Institute of Technology, School of Advanced.
3. Kyung min Han. Collision Free Path Planning Algorithms for Robot Navigation Problem. University of Missouri-Columbia, August 2007.
4. CV Reference Manual, OpenCV Documentation.
5. Ivor Horton. Ivor Horton's Beginning Visual C++ 2008. Indianapolis, Indiana: Wiley Publishing, Inc., 2008.
6. Kevin Hawkins, Dave Astle. OpenGL Game Programming. Roseville, California: Prima Communications, Inc., 2001.
7. Kok Seng Chong, Lindsay Kleeman. Accurate Odometry and Error Modelling for a Mobile Robot. Monash University, April 1997. Clayton, Victoria: IEEE, 1997. 2783-2788.
8. M. S. Keir, C. E. Hann, J. G. Chase, and X. Q. Chen. A New Approach to Accelerometer-based Head Tracking for Augmented Reality & Other Applications. University of Canterbury.
9. Ryan Stephens, Ron Plew, Arie D. Jones. Sams Teach Yourself SQL in 24 Hours. 4th ed. Indianapolis, Indiana: Sams Publishing, 2008.
10. H. M. Deitel, P. J. Deitel. C How to Program. 2nd ed. New Jersey: Prentice Hall, 1994.


APPENDIX A

Program Source Code: Main Board Program

#include <p30f3011.h>       /* device header - assumed; the original angle-bracket header names were lost in extraction */
#include <stdarg.h>         /* assumed: va_list/va_start are used in UARTTransmit() */
#include <math.h>           /* assumed: cos(), sin() and fabs() are used in the odometry and control routines */
#include "adc.h"

//===================================PIC Configuration=====================================
_FOSC(CSW_FSCM_OFF & XT_PLL8);
_FWDT(WDT_OFF);
_FBORPOR(PBOR_OFF & MCLR_DIS);

//===================================Define Constant=======================================
/*For stepper motor driver
  PWM2H/RE3 --> right step output (pwm output from PIC) --> change to normal I/O
  PWM1H/RE1 --> left step output (pwm output from PIC)  --> change to normal I/O
  RE2 --> right direction bit (output from PIC)
  RE4 --> left direction bit (output from PIC)
  *Duty cycle pulse width minimum 3us (full step, half phase)
===========================*/
#define LPULSE              _LATE1
#define RPULSE              _LATE3
#define RDIR                _LATE2
#define LDIR                _LATE4

/*For imu
  AN0 --> Y-RATE
  AN1 --> X-RATE
  AN4 --> ACCEL-X
  AN5 --> ACCEL-Y
  AN6 --> ACCEL-Z
  RD1 --> ST
===========================*/
#define IMU_ST              _LATD1

/*For General IO
  RD2 --> mode selection button (input to PIC)
  RD3 --> mode execution button (input to PIC)
  RB7 --> green led (output from PIC)
  RB8 --> red led (output from PIC)
===========================*/
#define BT_SELECT           _RD2
#define BT_EXEC             _RD3
#define LED_GREEN           _LATB7
#define LED_RED             _LATB8

///define motor control constant
#define MC_MAXSPEED         20000
#define MC_MINSPEED         0
#define MC_THRESHOLD        30000
#define MC_SPEEDSTEP        16
#define SCALE_FACTOR        3

///define general constant
#define TRUE                0xff
#define FALSE               0x00

///define uart constant
#define UART_LENGHTBUFFER   64

///define accelerometer constant
#define G_VALUE             9.80665
#define TIME_CY             0.000000005

///define odometry parameter (in cm)
#define ROBOT_LENGHT        23.2f
#define WHEEL_RADIUS        4.0f
#define PI                  3.141592654f

///define fuzzy constant
#define CORRECTION_CONSTANT 1000

///angle error constant
#define NL_LOW              -90.0f      //negative large
#define NL_MIDDLE           -54.0f
#define NL_HIGH             -18.0f
#define NS_LOW              -20.0f      //negative small
#define NS_MIDDLE           -10.25f
#define NS_HIGH             -0.5f
#define ZE_LOW              -1.0f       //zero
#define ZE_MIDDLE           0.0f
#define ZE_HIGH             1.0f
#define PS_LOW              0.5f        //positive small
#define PS_MIDDLE           10.25f
#define PS_HIGH             20.0f
#define PL_LOW              18.0f       //positive large
#define PL_MIDDLE           54.0f
#define PL_HIGH             90.0f

///distance error constant (in cm)
#define VN_BORDER           0.1f        //very near
#define VN_HIGH             0.5f
#define LN_LOW              0.3f        //little near
#define LN_MIDDLE           12.65f
#define LN_HIGH             25.0f
#define ME_LOW              20.0f       //medium
#define ME_MIDDLE           27.5f
#define ME_HIGH             35.0f
#define LF_LOW              33.0f       //little far
#define LF_MIDDLE           39.0f
#define LF_HIGH             45.0f
#define VF_LOW              40.0f       //very far
#define VF_BORDER           45.0f

///output constant
#define ST_MIDDLE           0           //stop
#define SF_MIDDLE           500         //slow forward
#define F_MIDDLE            3000        //forward
#define FF_MIDDLE           9000        //fast forward
#define SB_MIDDLE           -500        //slow backward
#define B_MIDDLE            -3000       //backward
#define FB_MIDDLE           -9000       //fast backward

///define macro
#define MIN(x, y)           (((x) > (y))?(y): (x))
#define MAX(x, y)           (((x) > (y))?(x): (y))

//===================================Data Structure Definition=============================
union HW_FLAG{                              //Define hardware flag for mainboard
    unsigned char state;
    struct {
        unsigned char btSelectFlag  : 1;
        unsigned char btExecFlag    : 1;
        unsigned char ldGreen       : 1;
        unsigned char ldRed         : 1;
        unsigned char reserved      : 4;
    };
}hwFlag;

union MC_FLAG{
    unsigned int state;
    struct {
        unsigned char updateSpeed   : 1;
        unsigned char activeLeft    : 1;
        unsigned char activeRight   : 1;
        unsigned char changeDir     : 1;
        unsigned char idle          : 1;
        unsigned char changeLeft    : 1;
        unsigned char changeRight   : 1;
        unsigned char updateLeft    : 1;
        unsigned char updateRight   : 1;
        unsigned char reserved      : 7;
    };
}mcFlag;

union UART_FLAG{
    unsigned char state;
    struct {
        unsigned char ProcessError  : 1;    //occurs when rxBuffer is full but no processing has happened
        unsigned char overWriteTx   : 1;    //occurs when writing to the txBuffer while it still holds data
        unsigned char reserved      : 6;
    };
}uartFlag;

union IC_FLAG{
    unsigned char state;
    struct {
        unsigned char channel1Active : 1;   //occurs when input capture channel 1 has a capture event
        unsigned char channel2Active : 1;   //occurs when input capture channel 2 has a capture event
        unsigned char channel7Active : 1;   //occurs when input capture channel 7 has a capture event
        unsigned char channel8Active : 1;   //occurs when input capture channel 8 has a capture event
        unsigned char reserved       : 4;
    };
}icFlag;

union CLOCK_FLAG{
    unsigned char state;
    struct {
        unsigned char time5ms       : 1;
        unsigned char time1s        : 1;
        unsigned char reserved      : 6;
    };
}timerFlag;

///define type structure
typedef struct FUZZY_OUTPUT{
    double fuzzyValue;
    int fuzzySet_left;
    int fuzzySet_right;
}fuzzyOutput;

typedef struct ODOMETRY_STATE{
    double x_value;                         //in cm
    double y_value;                         //in cm
    double angle;                           //in radian
}odometryState;

//===================================Global Variable=======================================
volatile unsigned int stepLeftSpeedBuffer = 0;      //unsigned 16-bit (volatile -> changed in ISR) to store the left speed value
volatile unsigned int stepRightSpeedBuffer = 0;     //unsigned 16-bit (volatile -> changed in ISR) to store the right speed value
volatile unsigned int tmr1Buffer = 0;               //unsigned 16-bit (volatile -> changed in ISR) to store the number of timer overflows
volatile double leftStep = 0;                       //accumulated angle step of left motor
volatile double rightStep = 0;                      //accumulated angle step of right motor
volatile unsigned char leftDir = 0;                 //unsigned 8-bit to store the left direction
volatile unsigned char rightDir = 0;                //unsigned 8-bit to store the right direction
volatile unsigned char txBuffer[UART_LENGHTBUFFER] = {0};   //array to store the values to be transmitted
volatile unsigned char rxBuffer[UART_LENGHTBUFFER] = {0};   //array to store the values that have been received
volatile unsigned char lenghtTXBuffer = 0;          //number of data bytes to be transmitted
volatile unsigned char lenghtRXBuffer = 0;          //number of data bytes received
volatile unsigned char txCounter = 0;               //current index in the transmit buffer
volatile unsigned char rxCounter = 0;               //current index in the receive buffer
volatile unsigned int test = 0;                     //testing data
volatile int test2 = 0;                             //testing data
volatile unsigned char test3[8] = {0};              //testing data
volatile double test4[4] = {0};                     //testing data
volatile unsigned int adcBuf[2] = {0};              //buffer for adc module

///fuzzy controller variable
int fuzz_motorLeft = 0;
int fuzz_motorRight = 0;

///fuzzy rule base constant (row -> distance, column -> angle)
const int leftRule[5][5] = {
    { ST_MIDDLE, ST_MIDDLE, ST_MIDDLE, ST_MIDDLE, ST_MIDDLE},
    { FF_MIDDLE, SF_MIDDLE, SF_MIDDLE, ST_MIDDLE, ST_MIDDLE},
    { FF_MIDDLE, F_MIDDLE , F_MIDDLE , SF_MIDDLE, SF_MIDDLE},
    { FF_MIDDLE, FF_MIDDLE, FF_MIDDLE, SF_MIDDLE, SF_MIDDLE},
    { FF_MIDDLE, FF_MIDDLE, FF_MIDDLE, SF_MIDDLE, SF_MIDDLE}};

const int rightRule[5][5] = {
    { ST_MIDDLE, ST_MIDDLE, ST_MIDDLE, ST_MIDDLE, ST_MIDDLE},
    { ST_MIDDLE, ST_MIDDLE, SB_MIDDLE, SB_MIDDLE, SB_MIDDLE},
    { SB_MIDDLE, SB_MIDDLE, B_MIDDLE , B_MIDDLE , FB_MIDDLE},
    { SB_MIDDLE, SB_MIDDLE, FB_MIDDLE, FB_MIDDLE, FB_MIDDLE},
    { SB_MIDDLE, SB_MIDDLE, FB_MIDDLE, FB_MIDDLE, FB_MIDDLE}};

///odometry variable
odometryState odoState    = {0.0f, 0.0f, 0.0f};
odometryState odoTarget   = {0.0f, 0.0f, 0.0f};
odometryState odoPrevious = {0.0f, 0.0f, 0.0f};

///path variable
char point[40] = {0};
unsigned char limit;

//====================================Function Prototype===================================
///Interrupt function prototype
void __attribute__((__interrupt__)) _OscillatorFail(void);
void __attribute__((__interrupt__)) _AddressError(void);
void __attribute__((__interrupt__)) _StackError(void);
void __attribute__((__interrupt__)) _MathError(void);
void __attribute__((__interrupt__)) _T1Interrupt(void);
void __attribute__((__interrupt__)) _U2TXInterrupt(void);
void __attribute__((__interrupt__)) _U2RXInterrupt(void);
void __attribute__((__interrupt__)) _T2Interrupt(void);
//void __attribute__((__interrupt__)) _PWMInterrupt(void);  //(Replaced)

void __attribute__((__interrupt__)) _T4Interrupt(void);
void __attribute__((__interrupt__)) _T5Interrupt(void);
void __attribute__((__interrupt__)) _ADCInterrupt(void);

///Subroutine function prototype
void Init(void);
void ModeIndication(unsigned char *mode);
void ExecMode(unsigned char *mode);
void delay(unsigned long num);              //unsigned long --> 32bit wide
void MotorControl(unsigned int leftSpeed, unsigned int rightSpeed, unsigned char leftDir, unsigned char rightDir);
void UARTTransmit(unsigned char lenght, ...);
unsigned char UARTReceipt(unsigned char lenght, unsigned char array[]);
void FuzzyControl_Position(double error, double aError);
void Odometry(double dLeftDistance, double dRightDistance);
void ControlAngle(double targetAngle);
void SetTarget(int x_value, int y_value);
void Reset(void);

//====================================Main Function========================================
int main(void)
{
    ///Declare local variable
    unsigned char mode = 0;                 //unsigned 8-bit to store the mode number

    Init();
    while(1)
    {
        if(!(BT_SELECT))
        {
            delay(2000);
            if(!(BT_SELECT) && !(hwFlag.btSelectFlag))
            {
                mode++;                     //increase mode number
                hwFlag.btSelectFlag = 1;    //set the select button flag
            }
        }
        else
            hwFlag.btSelectFlag = 0;        //clear the select button flag

        ModeIndication(&mode);

        if(!(BT_EXEC))
        {
            delay(2000);
            if(!(BT_EXEC) && !(hwFlag.btExecFlag))
            {
                ExecMode(&mode);
                hwFlag.btExecFlag = 1;      //set the execution button flag
            }
        }
        else
            hwFlag.btExecFlag = 0;          //clear the execution button flag

} return 0; } //=====================================Subroutine=========================================== void Init(void) { ///Declare local variable unsigned char i; //declare variable in unsigned 8bit to be counter in for loop unsigned int adcInit[4] = {0}; ///Initialize pin direction TRISB = 0xfe7f; TRISD = 0xffff; TRISC = 0x7fff; TRISE = 0xffc0; TRISF = 0xffdf; ///Indication of start Initialization LED_GREEN = 1; LED_RED = 1; ///Configure Interrupt(disable all interrupt) _IPL = 7;

//Set main routine to priority 7(disable interrupt)

65 INTCON1 = INTCON2 = 0;

//Enable nesting interrupt and use IYT

///Configure ADC(for X filter and Y filter) ADPCFG = 0xffcf; adcInit[0] = (autoSamp | autoConvert | sequence); //auto sampling and convert,all in sequence adcInit[1] = (CH0 | scanInput | freqSampleInterrupt(15)); //interrupt every 16 sampling/convert adcInit[2] = (autoSampTime(31) | convertTime(63)); //sampling time is 31Tad and conversion time is 12.8us adcInit[3] = 0x0000; //using the scan mode adcConfig(adcInit[0], adcInit[1], adcInit[2], adcInit[3], 0xffcf); //scan through AN0,AN1,AN4(ACCELX),AN5(ACCELY),AN6(ACCELZ) ADCSSL = 0b0000000000110000; _ADIP = 5; //set priority to 5 _ADIF = 0; //clear flag bit _ADIE = 1; //adc interrupt enable ///Configure Timer 1(Internal counter -> time source)(RTC) TMR1 = 0; //reset timer 1 counter T1CON = 0; //Stops timer 1 and reset control register PR1 = 0xffff; //set period register to maximum value; _T1IP = 1; _T1IF = 0; _T1IE = 1;

//set priority to 1 //Clear flag bits //timer 1 interrupt enables

///Configure Timer 2(IMU update rate) TMR2 = 0; T2CON = 0; PR2 = 0xffff;

(Input Capture timer source - onboard(previous)) //reset timer 2 counter //Stop timer 2 and reset control register //set period register to maximum value

_T2IP = 4; _T2IF = 0; _T2IE = 1;

//set priority to 1 //clear flag bits //timer 2 interrupt enables

///Configure Timer 4 and 5 to alternate PWM(step output) TMR4 = 0; TMR5 = 0; T4CON = 0x0000; //disable timer4 T5CON = 0x0000; //disable timer5 PR4 = 40000; //Pulse width 1:1->2ms PR5 = 40000; //Pulse width 1:1->2ms _T4IP = 2; _T5IP = 2; _T4IF = 0; _T5IF = 0; _T4IE = 1; _T5IE = 1; ///Configure UART2 for Communication with PC and Xbee U2MODE = 0x8003; //enable UART module and 8bit data, even parity, 2 stop bit U2STA = 0x0400; //enable transmit, interupt at each transmit data byte and each receive data byte U2BRG = 64; //Baud rate of 19200, error(0.2%) _U2TXIP = 4; _U2TXIF = 0; _U2TXIE = 1; _U2RXIP = 4; _U2RXIF = 0; _U2RXIE = 1; ///Configure Interrupt(Enable all interrupt) _IPL = 0;

//Set main routine to priority to 0(enable interrupt)

///Set initial condition ///Initial condition for timer 1 TMR1 = 15536; T1CON |= 0x8000;

//Initial timer value(another 2.5ms to overflow) //start timer 1;

///Initial condition for PWM(Replaced) -> Step output T4CON |= 0x8000; T5CON |= 0x8000; stepLeftSpeedBuffer = 30000; stepRightSpeedBuffer = 30000; LDIR = 0; LPULSE = 0; RDIR = 0; RPULSE = 0;

//start timer 4 //start timer 5 //set initial left speed buffer //set initial right speed buffer

66 ///Initial condition for control Flags hwFlag.state = 0; mcFlag.state = 0; rightDir = FALSE; leftDir = TRUE; uartFlag.state = 0; timerFlag.state = 0; rightStep = 0; leftStep = 0;

//clear all hardware flag //clear all motor control flag //set in forward direction //set in forward direction //clear all uart flag

lenghtTXBuffer = 0; lenghtRXBuffer = 0; txCounter = 0; rxCounter = 0; ///Indication of finish Initialization for(i = 0; i0;num--); } void MotorControl(unsigned int leftSpeed, unsigned int rightSpeed, unsigned char leftDirection, unsigned char rightDirection) { static unsigned char leftDirBuffer = FALSE; static unsigned char rightDirBuffer = FALSE; ///style to Prevent mis-control mcFlag when interrupt happens in the MotorControl function if((stepLeftSpeedBuffer != leftSpeed) && (stepRightSpeedBuffer != rightSpeed))

70 { mcFlag.updateSpeed = 1; mcFlag.updateLeft = 1;mcFlag.updateRight = 1; stepLeftSpeedBuffer = leftSpeed; stepRightSpeedBuffer = rightSpeed; } else if((stepLeftSpeedBuffer == leftSpeed) && (stepRightSpeedBuffer != rightSpeed)) { mcFlag.updateSpeed = 1; mcFlag.updateRight = 1; stepRightSpeedBuffer = rightSpeed; } else if((stepLeftSpeedBuffer != leftSpeed) && (stepRightSpeedBuffer == rightSpeed)) { mcFlag.updateSpeed = 1; mcFlag.updateLeft = 1; stepLeftSpeedBuffer = leftSpeed; } if((leftDirection != leftDirBuffer) && (rightDirection != rightDirBuffer)) { mcFlag.changeDir = 1; mcFlag.changeLeft = 1; mcFlag.changeRight = 1; } else if((leftDirection == leftDirBuffer) && (rightDirection != rightDirBuffer)) { mcFlag.changeDir = 1; mcFlag.changeRight = 1; } else if((leftDirection != leftDirBuffer) && (rightDirection == rightDirBuffer)) { mcFlag.changeDir = 1; mcFlag.changeLeft = 1; } leftDirBuffer = leftDirection; rightDirBuffer = rightDirection; } void UARTTransmit(unsigned char lenght, ...) { ///declare local variable unsigned char i; va_list arg;

//decalre variable in unsigned 8-bit counter value //declare variable in va_list type defined in stdarg.h

if(lenghtTXBuffer != txCounter) //if last transmission haven't complete { uartFlag.overWriteTx = 1; return; } va_start(arg, lenght); //initialize arg(type of va_list), 2nd parameter is the identifier of start of argument list for(i = 0; i< lenght; i++) txBuffer[i] = va_arg(arg, unsigned char);

//extract argument list and store into transmit buffer

lenghtTXBuffer = lenght; if(!(U2STAbits.UTXBF)) { U2TXREG = txBuffer[0]; txCounter = 1; } else { txCounter = 0; } va_end(arg); } unsigned char UARTReceipt(unsigned char lenght, unsigned char array[]) { unsigned char i; if(lenghtRXBuffer >= lenght) { for(i = 0; i LN_LOW) && (error < LN_MIDDLE)) { fError[counter] = (error - LN_LOW) / (LN_MIDDLE - LN_LOW); colError[counter] = 1; counter++; } else if((error > LN_MIDDLE) && (error < LN_HIGH)) { fError[counter] = 1.0f - (error - LN_MIDDLE) / (LN_HIGH - LN_MIDDLE); colError[counter] = 1; counter++; } if(error == ME_MIDDLE) { fError[counter] = 1.0f; colError[counter] = 2; counter++; } else if((error > ME_LOW) && (error < ME_MIDDLE)) { fError[counter] = (error - ME_LOW) / (ME_MIDDLE - ME_LOW); colError[counter] = 2; counter++; } else if((error > ME_MIDDLE) && (error < ME_HIGH)) { fError[counter] = 1.0f - (error - ME_MIDDLE) / (ME_HIGH - ME_MIDDLE); colError[counter] = 2; counter++; } if(error == LF_MIDDLE) { fError[counter] = 1.0f; colError[counter] = 3; counter++; } else if((error > LF_LOW) && (error < LF_MIDDLE)) { fError[counter] = (error - LF_LOW) / (LF_MIDDLE - LF_LOW); colError[counter] = 3; counter++; } else if((error > LF_MIDDLE) && (error < LF_HIGH)) { fError[counter] = 1.0f - (error - LF_MIDDLE) / (LF_HIGH - LF_MIDDLE); colError[counter] = 3; counter++; } if(error >= VF_BORDER) { fError[counter] = 1.0f; colError[counter] = 4; counter++; else if((error > VF_LOW) && (error < VF_BORDER))

}

72 {

fError[counter] = (error - VF_LOW) / (VF_BORDER - VF_LOW); colError[counter] = 4; counter++;

} counter = 0; if(aError == NL_MIDDLE) { fAngle[counter] = 1.0f; colAngle[counter] = 0; counter++; } else if((aError > NL_LOW) && (aError < NL_MIDDLE)) { fAngle[counter] = (aError - NL_LOW) / (NL_MIDDLE - NL_LOW); colAngle[counter] = 0; counter++; } else if((aError > NL_MIDDLE) && (aError < NL_HIGH)) { fAngle[counter] = 1.0f - (aError - NL_MIDDLE) / (NL_HIGH - NL_MIDDLE); colAngle[counter] = 0; counter++; } if(aError == NS_MIDDLE) { fAngle[counter] = 1.0f;colAngle[counter] = 1; counter++; } else if((aError > NS_LOW) && (aError < NS_MIDDLE)) { fAngle[counter] = (aError - NS_LOW) / (NS_MIDDLE - NS_LOW); colAngle[counter] = 1; counter++; } else if((aError > NS_MIDDLE) && (aError < NS_HIGH)) { fAngle[counter] = 1.0f - (aError - NS_MIDDLE) / (NS_HIGH - NS_MIDDLE); colAngle[counter] = 1; counter++; } if(aError == ZE_MIDDLE) { fAngle[counter] = 1.0f; colAngle[counter] = 2; counter++; } else if((aError > ZE_LOW) && (aError < ZE_MIDDLE)) { fAngle[counter] = (aError - ZE_LOW) / (ZE_MIDDLE - ZE_LOW); colAngle[counter] = 2; counter++; } else if((aError > ZE_MIDDLE) && (aError < ZE_HIGH)) { fAngle[counter] = 1.0f - (aError - ZE_MIDDLE) / (ZE_HIGH - ZE_MIDDLE); colAngle[counter] = 2; counter++; } if(aError == PS_MIDDLE) { fAngle[counter] = 1.0f; colAngle[counter] = 3; counter++; } else if((aError > PS_LOW) && (aError < PS_MIDDLE)) { fAngle[counter] = (aError - PS_LOW) / (PS_MIDDLE - PS_LOW); colAngle[counter] = 3; counter++; } else if((aError > PS_MIDDLE) && (aError < PS_HIGH)) { fAngle[counter] = 1.0f - (aError - PS_MIDDLE) /(PS_HIGH - PS_MIDDLE); colAngle[counter] = 3; counter++; } if(aError == PL_MIDDLE) { fAngle[counter] = 1.0f; colAngle[counter] = 4; counter++; } else if((aError > PL_LOW) && (aError < PL_MIDDLE)) { fAngle[counter] = (aError - PL_LOW) / (PL_MIDDLE - PL_LOW); colAngle[counter] = 4; counter++; } else if((aError > PL_MIDDLE) && (aError < PL_HIGH)) { fAngle[counter] = 1.0f - (aError - PL_MIDDLE) /(PL_HIGH - PL_MIDDLE); colAngle[counter] = 4; counter++; } ///rule base k = 0; for(i = 0; i 20000)?20000: fuzz_motorLeft; fuzz_motorRight = (fuzz_motorRight < -20000)?-20000: fuzz_motorRight; } void ControlAngle(double targetAngle) { double error = 0.0f; double compensate = 0.0f; double dLeftDistance = 0.0f; double dRightDistance = 0.0f; delay(100000); stepLeftSpeedBuffer = 30000; //set initial left speed buffer stepRightSpeedBuffer = 30000; //set initial right speed buffer do { dLeftDistance = (leftStep * 0.017453292f * WHEEL_RADIUS); dRightDistance = (rightStep * 0.017453292f * WHEEL_RADIUS); leftStep = 0.0f; rightStep = 0.0f; Odometry(dLeftDistance, dRightDistance); error = targetAngle - odoState.angle; leftDir = (error > 0)? FALSE : TRUE; rightDir = (error > 0)? FALSE : TRUE; compensate = error * 500;

//if error positive, left motor is backward //if error positive, right motor is forward

mcFlag.activeLeft = 1; mcFlag.activeRight = 1; MotorControl((unsigned int)(compensate), (unsigned int)(compensate), leftDir, rightDir); }while(fabs(error) > 0.128f); mcFlag.activeLeft = 0; mcFlag.activeRight = 0; stepLeftSpeedBuffer = 30000; //set initial left speed buffer stepRightSpeedBuffer = 30000; //set initial right speed buffer delay(100000); } void Odometry(double dLeftDistance, double dRightDistance) { /********************************************************************************* * fx = x + dDistance * cos(angle + dAngle/2) * fy = y + dDistance * sin(angle + dAngle/2) * fangle = angle + dAngle * dDistance = (dLeftDistance + dRightDistance) / 2 * dAngle = (dRightDistance - dLeftDistance) / lenght; **********************************************************************************/ double dAngle = 0.0f; double dDistance = 0.0f; dDistance = (dLeftDistance + dRightDistance) / 2; dAngle = (dRightDistance - dLeftDistance) / ROBOT_LENGHT; odoState.x_value += dDistance * cos(odoState.angle + (dAngle / 2)); odoState.y_value += dDistance * sin(odoState.angle + (dAngle / 2)); odoState.angle += dAngle; if(odoState.angle > PI) else if(odoState.angle < -PI) } void Reset(void) { Init(); odoState.x_value = 0.0f;

odoState.angle = 2 * PI - odoState.angle; odoState.angle += 2*PI;

74 odoState.y_value = 0.0f; odoState.angle = 0.0f; odoTarget.x_value = 0.0f; odoTarget.y_value = 0.0f; odoTarget.angle = 0.0f; odoPrevious.x_value = 0.0f; odoPrevious.y_value = 0.0f; odoPrevious.angle = 0.0f; } //==========================================Interrupt Function=========================================================== void __attribute__((__interrupt__)) _OscillatorFail(void) { while(1); } void __attribute__((__interrupt__)) _AddressError(void) { while(1); } void __attribute__((__interrupt__)) _StackError(void) { while(1); } void __attribute__((__interrupt__)) _MathError(void) { while(1); } void __attribute__((__interrupt__)) _T1Interrupt(void) { _T1IF = 0; TMR1 = 15536; tmr1Buffer++; test++; if((tmr1Buffer % 2) == 0) if(tmr1Buffer == 400) //for 1s { ///start of other functions _ADON = 1;

//Initial timer value(another 2.5ms to overflow)

timerFlag.time5ms = 1;

//Enable adc module

timerFlag.time1s = 1; tmr1Buffer = 0; } } void __attribute__((__interrupt__)) _U2TXInterrupt(void) { //=============Occur when each character in transmit buffer transfer to TSR======================== _U2TXIF = 0; //Clear transmit flag bit if(txCounter != (lenghtTXBuffer)) { if(!(U2STAbits.UTXBF)) { U2TXREG = txBuffer[txCounter]; txCounter++; } } else { uartFlag.overWriteTx = 0; lenghtTXBuffer = 0; txCounter = 0; } } void __attribute__((__interrupt__)) _U2RXInterrupt(void) { unsigned char discard = 0; unsigned char i;


//==================Occur when received each charater data================================= _U2RXIF = 0; //Clear receive flag bit if((U2STAbits.PERR)|| (U2STAbits.FERR))//if receive parity error and framming error { if(U2STAbits.URXDA) discard = U2RXREG; } else { for(i = 0; i PR4 = 20000 else if(stepLeftSpeedBuffer > MC_MAXSPEED) stepLeftSpeedBuffer = MC_MAXSPEED; //Pulse width of 2ms -> PR4 = 40000 stepLeftSpeedBuffer = MC_THRESHOLD - stepLeftSpeedBuffer; mcFlag.updateLeft = 0; if(!(mcFlag.updateLeft) && !(mcFlag.updateRight)) mcFlag.updateSpeed = 0; } ///update speed in process if(PR4 != stepLeftSpeedBuffer && !(mcFlag.changeDir)) { if(stepLeftSpeedBuffer > PR4) { if((stepLeftSpeedBuffer - PR4) >= MC_SPEEDSTEP) PR4 = PR4 + MC_SPEEDSTEP; else PR4 = stepLeftSpeedBuffer; } else { if((PR4 - stepLeftSpeedBuffer) >= MC_SPEEDSTEP) PR4 = PR4 - MC_SPEEDSTEP; else PR4 = stepLeftSpeedBuffer; } } ///idle mode if(mcFlag.idle) { if(mcFlag.changeDir) { PR4 = 40000; if(mcFlag.changeLeft) //set direction mcFlag.changeLeft = 0; } else { mcFlag.activeLeft = 0; mcFlag.idle = 0; }

//Pulse width 2ms(min speed) LDIR = (leftDir & 0x80)? 1:0;

76 if(!(mcFlag.changeLeft) && !(mcFlag.changeRight)) { mcFlag.idle = 0; mcFlag.changeDir = 0; mcFlag.activeLeft = 1; mcFlag.activeRight = 1; } } ///change direction command if(mcFlag.changeDir) { mcFlag.idle = 1; if(mcFlag.changeLeft) PR4 = 0xffff; mcFlag.activeLeft = 0; } if((mcFlag.activeLeft) && ((scale % SCALE_FACTOR) == 0)) { static unsigned char t = 0;

//temporary disable the pulse toggle //set to idle speed

LPULSE ^= 1; t = (t + 1) % 2; if(t == 0) { if(leftDir == FALSE) leftStep -= 1.8; else if(leftDir == TRUE) leftStep += 1.8; } scale = 0; } scale++; } void __attribute__((__interrupt__)) _T5Interrupt(void) { static char scale = 0; _T5IF = 0; ///Update speed in calculation if(mcFlag.updateSpeed && mcFlag.updateRight) { if(stepRightSpeedBuffer < MC_MINSPEED) stepRightSpeedBuffer = MC_MINSPEED; //Pulse width of 1ms -> PR4 = 20000 else if(stepRightSpeedBuffer > MC_MAXSPEED) stepRightSpeedBuffer = MC_MAXSPEED; //Pulse width of 2ms -> PR4 = 40000 stepRightSpeedBuffer = MC_THRESHOLD - stepRightSpeedBuffer; mcFlag.updateRight = 0; if(!(mcFlag.updateLeft) && !(mcFlag.updateRight)) mcFlag.updateSpeed = 0; } ///update speed in process if(PR5 != stepRightSpeedBuffer && !(mcFlag.changeDir)) { if(stepRightSpeedBuffer > PR5) { if((stepRightSpeedBuffer - PR5) >= MC_SPEEDSTEP) PR5 = PR5 + MC_SPEEDSTEP; else PR5 = stepRightSpeedBuffer; } else { if((PR5 - stepRightSpeedBuffer) >= MC_SPEEDSTEP) PR5 = PR5 - MC_SPEEDSTEP; else PR5 = stepRightSpeedBuffer; } } if(mcFlag.idle) { if(mcFlag.changeDir) { PR5 = 40000; //Pulse width 2ms if(mcFlag.changeRight) mcFlag.changeRight = 0;

RDIR = (rightDir & 0x80)? 1:0;

77 } else { mcFlag.activeRight = 0; mcFlag.idle = 0; } if(!(mcFlag.changeLeft) && !(mcFlag.changeRight)) { mcFlag.idle = 0; mcFlag.changeDir = 0; mcFlag.activeRight = 1; mcFlag.activeLeft = 1; } } if(mcFlag.changeDir) { mcFlag.idle = 1; if(mcFlag.changeRight) PR5 = 0xffff; mcFlag.activeRight = 0; } if((mcFlag.activeRight) && ((scale % SCALE_FACTOR) == 0)) { static unsigned char t = 0; RPULSE ^= 1; t = (t + 1) % 2; if(t == 0) { if(rightDir == TRUE) rightStep -= 1.8; else if(rightDir == FALSE) rightStep += 1.8; } scale = 0; } scale++; } void __attribute__((__interrupt__)) _ADCInterrupt(void) { static unsigned int filterBuffer[2][4] = {{0}}; static unsigned char i = 0; unsigned int adcFilterResult[2] = {0}; unsigned char h; unsigned int *ADRES; _ADIF = 0; ADRES = recordADC();

for(h = 0;h