Vanishing Point Estimation for On-Road Navigation


Kamini Sabu, Dr. Prof. M. H. Nerkar
E&TC Department, Government College of Engineering, Jalgaon, India

Abstract— Autonomous robot navigation for on-road driving has gained growing importance in automotive research. Image-based path tracking is being considered for future driving assistance. The image from a forward-facing monocular camera is used to track the drivable road area and the road direction. Vanishing point estimation is considered very important in driver assistance systems because it indicates the possible turning direction of the road ahead. This paper describes different methods used for vanishing point estimation in a captured image frame. Since on-road driving is a real-time operation, timing considerations play a vital role along with accuracy. With this constraint in mind, an algorithm is proposed for texture-based vanishing point detection.

Keywords— Autonomous Vehicles, Path Tracking, Robot Navigation, Vanishing Point Tracking, Driver Assistance

I. INTRODUCTION

In today's automation era, major efforts are being directed at making robots completely autonomous machines. Vehicles are one of the principal areas being considered for automation, with driver assistance systems and fully automated vehicles among the major topics under development. Various detection-and-ranging methods such as RADAR and LIDAR are being tested in competitions and even on public roads. Given the disadvantages and limitations of these methods, vision-based methods have been proposed and are under active research. Vision-based systems use a camera facing the road; images are captured continuously and then processed to extract the drivable road area and the road direction. Common methods for navigation and road-area estimation include Hough-transform-based edge detection, colour-feature-based image segmentation, and model-based image classification. All of these are suitable only for well-structured roads and fail to locate the road direction and area when roads are ill-structured or unstructured. The vanishing point, in contrast, is a global feature of road images and therefore assists direction estimation on unstructured roads as well. The movement of the vanishing point over time clearly indicates the direction of the road, and drivers are normally found to keep their eyes on the vanishing point to regulate the speed of their vehicle.

This phenomenon is exploited by vanishing-point-based vision methods for road tracking in driver assistance systems as well as automated vehicles. The vanishing point detection approach assumes no prior knowledge about the environment and can cope with complex situations such as ground colour variations, different illuminations, and cast shadows. This paper considers vanishing point methods for vision-based navigation. The remainder of the paper is organized as follows. Section II briefly introduces the concept of the vanishing point and the techniques for its estimation. Section III analyzes in detail the different methods used for estimating the vanishing point. Section IV proposes a novel algorithm for determining the vanishing point, while Section V presents experimental results and discusses the merits and demerits of the novel algorithm.

II. VANISHING POINT

The vanishing point is the point at which receding parallel lines seem to meet when represented in linear perspective. When an image of a three-dimensional scene is captured, a set of parallel lines in space appears to have intersecting projections; the point in the picture plane where these lines appear to intersect is called the vanishing point. It is thus the spot on the horizon line towards which the receding parallel lines diminish. A typical road picture is shown in Fig. 1, with a red circle indicating the vanishing point. The vanishing point indicates the end of the road in sight and is therefore helpful in determining the road direction, and hence the direction of movement for the vehicle.
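The convergence of parallel lines under perspective can be made concrete with a small numeric sketch. The following Python fragment is purely illustrative and not part of the paper: the pinhole intrinsics, lane offsets and line direction are assumed values chosen only to show that both projections approach the same pixel.

```python
# Illustrative only: two parallel 3D lines, seen through an assumed pinhole
# camera, project to image points that converge to a single vanishing point.
import numpy as np

fx = fy = 250.0          # assumed focal length in pixels (not from the paper)
cx, cy = 160.0, 120.0    # assumed principal point for a 320x240 image

def project(P):
    """Pinhole projection of a camera-frame point P = (X, Y, Z) to pixel (u, v)."""
    X, Y, Z = P
    return np.array([fx * X / Z + cx, fy * Y / Z + cy])

d = np.array([0.0, 0.0, 1.0])            # both lane borders run straight ahead (+Z)
left  = np.array([-1.5, 1.2, 2.0])       # 1.5 m left of the camera, 1.2 m below it (Y points down)
right = np.array([ 1.5, 1.2, 2.0])       # 1.5 m right of the camera

for t in (1.0, 10.0, 100.0, 1000.0):     # move further and further along both lines
    print(t, project(left + t * d).round(2), project(right + t * d).round(2))

# Both sequences tend to (fx*dx/dz + cx, fy*dy/dz + cy) = (160, 120):
# the common limit depends only on the lines' direction, i.e. the vanishing point.
```

Any other family of parallel lines, with a different direction d, converges to a different image point, which is why the road's dominant direction shows up as a single, trackable point in the frame.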

Fig.1 Vanishing point shown with a red circle

Vanishing point detection techniques normally involve two main stages:

1) Estimating Vanishing Point Candidates: In this stage, all the points that are likely to be the vanishing point are estimated. Edges or texture orientation are taken as the basis for determining the vanishing point; accordingly, there are two families of methods, a) edge based and b) texture orientation based.

2) Vanishing Point Voting: In this stage, a voting region is decided. Points from the voting region vote for the vanishing point, and the candidate receiving the maximum number of votes is taken as the vanishing point.

A. Edge Based Method

In the edge based method of vanishing point determination, the two stages are carried out as follows.

1) Estimating Vanishing Point Candidates:
a) Edge detection is applied to the image; the Canny edge detector is popularly used.
b) Straight line segments are obtained using the Hough transform.
c) Another Hough transform is applied to determine the points of intersection of all these line segments. These intersection points constitute the candidates for the vanishing point.

2) Vanishing Point Voting:
a) While determining the intersection points, the number of lines intersecting at each particular point is also counted.
b) The point where the maximum number of line segments intersect is considered to be the vanishing point [10]. This technique is also called the Cascaded Hough Transform [11] because the Hough transform is applied twice.
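A minimal OpenCV sketch of this edge-based pipeline is shown below. It is not the implementation evaluated later in the paper: the Canny thresholds and Hough parameters are arbitrary assumptions, and the second (voting) Hough pass is approximated here by explicitly intersecting every pair of detected segments and accumulating the intersections.

```python
# Illustrative sketch of the edge-based pipeline (assumed parameters, not the paper's).
import cv2
import numpy as np
from itertools import combinations

def edge_based_vanishing_point(bgr):
    gray  = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                          # stage 1a: edge map
    segs  = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                            minLineLength=40, maxLineGap=10)  # stage 1b: line segments
    if segs is None:
        return None
    h, w = gray.shape
    votes = np.zeros((h, w), np.int32)                        # accumulator over image pixels
    for (x1, y1, x2, y2), (x3, y3, x4, y4) in combinations(segs[:, 0], 2):
        # stage 1c / 2a: intersect every pair of segments using homogeneous coordinates
        l1 = np.cross([x1, y1, 1.0], [x2, y2, 1.0])
        l2 = np.cross([x3, y3, 1.0], [x4, y4, 1.0])
        p  = np.cross(l1, l2)
        if abs(p[2]) < 1e-9:
            continue                                          # (nearly) parallel segments
        x, y = int(round(p[0] / p[2])), int(round(p[1] / p[2]))
        if 0 <= x < w and 0 <= y < h:
            votes[y, x] += 1                                  # count intersecting lines
    y, x = np.unravel_index(votes.argmax(), votes.shape)      # stage 2b: max-vote pixel
    return x, y
```

Because every vehicle or pedestrian contributes extra segments, the intersection cloud drifts towards such obstacles, which is exactly the failure mode discussed in Section III.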

B. Texture Orientation Based Method

In the texture orientation based method of vanishing point determination, the two stages are carried out as follows.

1) Estimating Vanishing Point Candidates:
a) The local texture orientation is computed at every image pixel. Multi-scale oriented filters such as steerable filters are applied and the maximum-value responses are analysed; another approach uses principal component analysis of Gaussian pyramid gradients. A bank of Gabor wavelet filters at equally spaced angles is also used [10].
b) The orientation giving the maximum response is chosen as the dominant orientation.
c) A ray is drawn starting from pixel p and directed upwards in the image at angle θ(p), where θ(p) is the dominant orientation of pixel p. All the points along this ray constitute candidates for the vanishing point [10].

2) Vanishing Point Voting:
a) A voting region is decided, and only the image pixels within this region are allowed to vote. The region is chosen based on pixel position with respect to the vanishing line; pixels above the current candidate are not allowed to vote [10].
b) Objective-function voting is used: the angle between the line joining an image pixel to the vanishing point candidate and the dominant orientation at that pixel is computed, and the vote is counted only if this angle difference is within a predefined threshold [10].
c) The point receiving the maximum number of votes is considered to be the vanishing point.
d) An exhaustive or randomized search may be applied to find the global maximum of the votes.
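The first stage can be sketched with an OpenCV Gabor bank as follows; the number of orientations, kernel size and wavelength are illustrative assumptions rather than the settings used in [10].

```python
# Illustrative sketch of stage 1: per-pixel dominant texture orientation
# from a bank of Gabor filters at equally spaced angles (parameters assumed).
import cv2
import numpy as np

def dominant_orientation_map(gray, n_orient=36, ksize=17, sigma=4.0, lambd=8.0):
    """Return (theta_map, energy_map): the angle of the strongest Gabor response
    at every pixel and the corresponding response magnitude."""
    gray = gray.astype(np.float32)
    thetas = np.arange(n_orient) * np.pi / n_orient        # equally spaced orientations
    responses = []
    for theta in thetas:
        kern = cv2.getGaborKernel((ksize, ksize), sigma, theta, lambd, 0.5, 0)
        responses.append(np.abs(cv2.filter2D(gray, cv2.CV_32F, kern)))
    stack = np.stack(responses)                            # shape: (n_orient, H, W)
    best = stack.argmax(axis=0)                            # index of the winning filter
    return thetas[best], stack.max(axis=0)

# Stage 2 then lets every pixel vote for candidates on the ray cast upwards
# from it along theta_map[y, x], as described in steps 2a)-2d) above.
```

With 36 (or, in [10], 72) filter passes over the whole frame, this first stage dominates the running time, which motivates the reduced four-filter bank discussed in Section III.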

III. VANISHING POINT ESTIMATION

Road detection based on vanishing point estimation involves road direction estimation as well as road boundary estimation. Both applications are real-time and hence time-critical. As noted in Section II, two methods are normally used for vanishing point detection. Of these, the edge based method is well suited to structured roads, but it is strongly affected by traffic on the road: vehicles and pedestrians add extra interfering edges, and the intersection points then cluster near such obstacles, leading to detection of a false vanishing point. For ill-structured or unstructured roads, and for roads with heavy traffic, the texture based approach gives more accurate results. However, texture based methods use a large number of Gabor filters with equally spaced orientations ([10] uses 72 filters), which increases computational complexity and time requirements. Therefore, in order to achieve accuracy in real time, both stages of vanishing point estimation need to be modified somewhat; [6] and [8] suggested and implemented such modifications.

A. Improved Vanishing Point Detection

The first stage of the texture orientation based method uses Gabor filters with equally spaced orientations. If the number of Gabor filters can be minimized, the time required is automatically reduced, speeding up the procedure. [8] proposed the use of just four Gabor filters with orientations 0, π/4, π/2 and 3π/4; the joint Gabor energy responses are then considered, either by suppressing orthogonal components [8] or by a best-worst combination [9]. [6] proposed a locally adaptive soft-voting scheme. The hard voting scheme used in [10] tends to locate the vanishing point too high in the image, since the only criterion for eligible voters is the angular distance. The soft voting scheme, on the other hand, also considers the physical distance between the vanishing point candidate and the voter. An advantage of the locally adaptive soft voting scheme in [6] is that the voting score is far less than one, so support for false vanishing points is suppressed. However, vanishing point candidates are considered only within the top 90% of the image, which is taken as a realistic assumption, and only a small group of pixels around each vanishing point candidate is considered as voters for that candidate. [8] used a different voting system based on weighting by orientation: no vote is given to horizontal rays, while the maximum vote is given to vertical rays, and each weighted ray is multiplied by a distance function that gives a higher vote to points near the voter. To further improve speed, the image may first be down-sampled, giving far fewer candidates and voters without significantly affecting accuracy [6, 8]. In addition, sky pixels are always off-road pixels and can be excluded from the computation to reduce the time requirement and complexity. For this, [8] uses colour information as the feature space and detects the skyline (a hypothesized horizontal line separating the two dissimilar regions, ground and sky); all pixels above the skyline are excluded from the computations. A sketch of these two speed-ups is given below.
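The following Python fragment is only an illustration of these two ideas; the target size, the fixed sky fraction and the OpenCV-based implementation are assumptions made here, since [8] derives the skyline from colour features rather than from a fixed band.

```python
# Illustration only: shrink the voter/candidate set by down-sampling the frame
# and discarding an assumed sky band. [8] detects the skyline from colour
# features; the fixed fraction used here is just a placeholder.
import cv2
import numpy as np

def reduce_workload(bgr, target_size=(320, 240), sky_fraction=0.10):
    """Resize the frame and build a boolean mask of pixels allowed to vote."""
    small = cv2.resize(bgr, target_size, interpolation=cv2.INTER_AREA)
    mask = np.ones(small.shape[:2], dtype=bool)
    skyline_row = int(sky_fraction * small.shape[0])
    mask[:skyline_row, :] = False    # pixels above the (assumed) skyline do not vote
    return small, mask
```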

IV. PROPOSED ALGORITHM

The proposed algorithm uses many of the improvements suggested in [6] and [8]. Only four Gabor filters are used, as suggested in [8]; however, instead of suppressing two of them, all four orientation responses are considered when calculating the dominant orientation at a pixel. The local dominant orientation at each pixel p(x, y) is calculated as follows.

1) E1, E2, E3 and E4 are the Gabor energy responses for orientations 0, π/4, π/2 and 3π/4.
2) The complex response in the direction of the effective orientation is then calculated as

Ox = E1 cos(0) + E2 cos(π/4) + E3 cos(π/2) + E4 cos(3π/4)    ... (1)
Oy = E1 sin(0) + E2 sin(π/4) + E3 sin(π/2) + E4 sin(3π/4)    ... (2)
O(p) = Ox + jOy    ... (3)

3) Therefore, the orientation at pixel p is given by

θ(p) = tan⁻¹(Oy / Ox)    ... (4)

As all four orientations are considered in the calculation, the time required is more than in [8]. This extra time requirement is offset by the time saved at the voting stage. The voting algorithm is much the same as that used in [8]; the only difference is that a disc-shaped voting region above the pixel under consideration is used as the source of vanishing point candidates. As shown in Fig. 2, within the complete image a pixel p(i, j) can vote only for the shaded disc-shaped region of radius D. The algorithm can thus be stated as:

1) Define a ray rp = (p, θ(p)) within its corresponding disc-shaped voting region, for each orientation θ(p).
2) Weigh each θ(p) by its sine, sin(θ(p)).
3) Multiply each weighted ray by a distance function,

... (5)

where d′ = d / D is the normalized distance, d is the distance between the pixel p and the current vanishing point candidate on its corresponding ray, and D is the normalization parameter.
4) Form a voting image VP based on the votes accumulated at each pixel.
5) Select the pixel with the maximum vote value in the voting image as the vanishing point for the corresponding image frame.

Fig.2 Disc voting region

After trying different values of D, it was found that D = 50 gives the best results for a 320x240 image. At this radius, the physical-distance weighting nullifies votes outside the region, so those entries can be neglected entirely while attention is kept on the pixels inside the disc. This reduces the time required for the voting process. An illustrative sketch of the orientation computation and disc voting follows.
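The following Python sketch strings the pieces together: the four Gabor energies are combined into a single orientation per pixel, in the manner of Eqs. (1)-(4), and every pixel then votes upwards along its ray inside a disc of radius D, weighted by sin(θ(p)). It is an illustration only: the Gabor kernel parameters are assumed, and the linear fall-off max(0, 1 - d/D) is a stand-in for the distance function of Eq. (5), which is not reproduced above.

```python
# Illustrative sketch of the proposed scheme (assumed Gabor parameters;
# the distance weight below is a stand-in for the paper's Eq. (5)).
import cv2
import numpy as np

THETAS = np.array([0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4])

def gabor_energies(gray, ksize=17, sigma=4.0, lambd=8.0):
    """Gabor energies E1..E4 at orientations 0, pi/4, pi/2, 3pi/4."""
    gray = gray.astype(np.float32)
    return [np.abs(cv2.filter2D(gray, cv2.CV_32F,
                                cv2.getGaborKernel((ksize, ksize), sigma, t, lambd, 0.5, 0)))
            for t in THETAS]

def dominant_orientation(gray):
    """Combine the four energies as in Eqs. (1)-(4); orientation returned in [0, pi)."""
    E = gabor_energies(gray)
    Ox = sum(e * np.cos(t) for e, t in zip(E, THETAS))
    Oy = sum(e * np.sin(t) for e, t in zip(E, THETAS))
    return np.mod(np.arctan2(Oy, Ox), np.pi)

def vote_vanishing_point(gray, D=50):
    """Each pixel votes upwards along its ray inside a disc of radius D."""
    theta = dominant_orientation(gray)
    h, w = gray.shape
    votes = np.zeros((h, w), np.float32)
    for y in range(h):
        for x in range(w):
            t = theta[y, x]
            weight = np.sin(t)                       # no vote for horizontal orientations
            if weight < 1e-3:
                continue
            for d in range(1, D + 1):                # walk the ray, staying inside the disc
                qx = int(round(x + d * np.cos(t)))
                qy = int(round(y - d * np.sin(t)))   # minus: image y grows downwards
                if not (0 <= qx < w and 0 <= qy < h):
                    break
                votes[qy, qx] += weight * max(0.0, 1.0 - d / D)   # assumed distance weight
    vy, vx = np.unravel_index(votes.argmax(), votes.shape)
    return vx, vy
```

A usage example would be `vote_vanishing_point(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))` on a 320x240 frame; the explicit loops mirror steps 1)-5) above and are kept unvectorised for readability.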

V. EXPERIMENTAL RESULTS

The algorithms were coded in MATLAB R2007a. The computer used was an HCL ezeebee PC with an AMD Sempron 2800+ processor @ 1.60 GHz running Windows XP Professional Service Pack 2. A database of images of size 360x540 was used for the experiments; the images were first resized to 320x240. The results obtained for the different algorithms are shown in Fig. 3. If the image size is changed, some impact on the results has been observed: the parameter D (the normalization distance) needs to be selected according to the image size and the resolution of the camera under consideration. For the results in Fig. 3, D = 50 was used. Table 1 summarizes the computation time required by each method.

The first column shows two figures, where the first shows the Hough segments and the second shows the vanishing point as red dots. Due to voting ambiguity, two or more points can receive the same vote and are all reported as the vanishing point, making it difficult to decide the exact point. This method works very well if the road is well structured with no traffic on it; however, this is an ideal condition that rarely occurs in practice.

The second column shows the results of the Kong method, with the vanishing point denoted by a green dot. This method gives good results for unstructured roads, and even road curvature has little impact on the vanishing point estimate; however, the overall computation time is very large. The third column shows the results of the Peyman method, which proves to be the best compromise between accuracy and time-criticality; in the case of heavy traffic, however, this method also suffers. The new approach presented in this paper proves efficient for roads with heavy traffic, although it fails on curved roads. As traffic is the normal scenario in driving, the proposed algorithm proves efficient for on-road direction estimation.

Fig.3 Results for various types of road conditions with different methods

VI. CONCLUSIONS

In this paper, we propose a new approach that helps determine the vanishing point for road tracking in less time and with better accuracy on heavy-traffic roads.

The algorithm provides better estimation of road direction and can guide a robot more accurately in crowded environments. It suffers, however, from a lack of accuracy on curved roads; addressing this forms the future scope of this work.

ACKNOWLEDGMENT

The authors acknowledge the support of Government College of Engineering, Jalgaon, and its faculty.


REFERENCES

[1] Alvarez J., Gevers T. and Lopez A., 'Vision-Based Road Detection using Road Models', Proceedings of IEEE International Conference on Image Processing, pp 2073-2076, 2009
[2] Alvarez J. and Lopez A., 'Novel index for objective evaluation of road detection algorithms', Proceedings of IEEE Conference on Intelligent Transportation Systems, pp 815-820, 2008
[3] Gonzalez R., Woods R. and Eddins S., 'Digital Image Processing using MATLAB', Pearson Education, South Asia, 2009
[4] Gonzalez R. and Woods R., 'Digital Image Processing', Pearson Education, South Asia, 2007
[5] Fritsch J., Kühnl T. and Geiger A., 'A New Performance Measure and Evaluation Benchmark for Road Detection Algorithms', Proceedings of IEEE International Conference on Intelligent Transportation Systems, pp 1693-1700, 2013
[6] Kong H., Audibert J. and Ponce J., 'General Road Detection from a Single Image', IEEE Transactions on Image Processing, vol. 19(8), pp 2211-2220, 2010
[7] Lu X., 'A New Vanishing Point Detection from a Single Image', Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, pp 901-904, 2012
[8] Moghadam P. and Dong J., 'Road Direction Detection Based on Vanishing-Point Tracking', Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp 1553-1560, 2012
[9] Moghadam P., Starzyk J. and Wijesoma W., 'Fast Vanishing-Point Detection in Unstructured Environments', IEEE Transactions on Image Processing, vol. 21(1), pp 425-430, 2012
[10] Rasmussen C., 'Grouping Dominant Orientations for Ill-Structured Road Following', Proceedings of IEEE Computer Vision and Pattern Recognition, pp 470-477, 2004
[11] Tuytelaars T., Proesmans M. and Gool L., 'The cascaded Hough transform', Proceedings of IEEE International Conference on Image Processing, 1998
