Research Papers

A Human-Inspired Method for Point-to-Point and Path-Following Navigation of Mobile Robots

Author and Article Information
F. Heidari

Mechanical Engineering Department,
University of Saskatchewan,
57 Campus Drive,
Saskatoon S7N 5A9, Canada

R. Fotouhi

Mem. ASME
Mechanical Engineering Department,
University of Saskatchewan,
57 Campus Drive,
Saskatoon S7N 5A9, Canada
e-mail: reza.fotouhi@usask.ca

Corresponding author.

Manuscript received January 15, 2014; final manuscript received May 24, 2015; published online July 27, 2015. Assoc. Editor: Andrew P. Murray.

J. Mechanisms Robotics 7(4), 041025 (Jul 27, 2015) (18 pages); Paper No. JMR-14-1028; doi: 10.1115/1.4030775

This paper describes a human-inspired method (HIM) and a fully integrated navigation strategy for a wheeled mobile robot in an outdoor farm setting. The proposed strategy is composed of four main actions: sensor data analysis, obstacle detection, obstacle avoidance, and goal seeking. Using these actions, the navigation approach is capable of autonomous row detection, row following, and path-planned motion in outdoor settings. To drive in off-road terrain, the robot must detect holes and ground depressions (negative obstacles), which are inherent to these environments, in real time and at a safe distance. Key originalities of the proposed approach are its ability to accurately detect both positive (above-ground) and negative obstacles, and to reliably identify the end of a row of bushes (e.g., in a farm) and enter the next row. Experimental evaluations were carried out using a differential-drive wheeled mobile robot in different settings. The robot used in the experiments carries a tilting unit with a laser range finder (LRF) for object detection and a real-time kinematic differential global positioning system (RTK-DGPS) unit for localization. Experiments demonstrate that the proposed technique successfully detects and follows rows (path following) and provides robust point-to-point motion control.
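
For concreteness, the sketch below shows a minimal, generic point-to-point controller for a differential-drive robot. It is not the paper's HIM controller; the proportional control law and all gains are illustrative assumptions.

```python
import math

# A minimal, generic point-to-point controller for a differential-drive
# robot (illustrative only; not the paper's HIM controller): steer toward
# the goal bearing and slow down as the goal is approached.
def point_to_point_cmd(x, y, theta, gx, gy, k_v=0.5, k_w=1.5, v_max=1.0):
    dx, dy = gx - x, gy - y
    dist = math.hypot(dx, dy)                        # distance to goal
    err = math.atan2(dy, dx) - theta                 # bearing error
    err = math.atan2(math.sin(err), math.cos(err))   # wrap to [-pi, pi]
    v = min(k_v * dist, v_max)                       # linear velocity command
    w = k_w * err                                    # angular velocity command
    return v, w

# Example: robot at the origin facing +x, goal at (5, 3)
print(point_to_point_cmd(0.0, 0.0, 0.0, 5.0, 3.0))
```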

Copyright © 2015 by ASME

References

[1] Bonin, F., Ortiz, A., and Oliver, G., 2008, "Visual Navigation for Mobile Robots: A Survey," J. Intell. Rob. Syst. Theory Appl., 53(3), pp. 263–296.
[2] Kobayashi, Y., Kurita, E., and Gouko, M., 2013, "Integration of Multiple Sensor Spaces With Limited Sensing Range and Redundancy," Int. J. Rob. Autom., 28(1), pp. 31–41.
[3] Nguyen, D., Kuhnert, L., and Kuhnert, K., 2013, "General Vegetation Detection Using an Integrated Vision System," Int. J. Rob. Autom., 28(2), pp. 170–179.
[4] Su, L., Luo, C., and Zhu, F., 2009, "Obtaining Obstacle Information by an Omnidirectional Stereo Vision System," Int. J. Rob. Autom., 24(3), pp. 222–227.
[5] Belforte, G., Deboli, R., and Piccarolo, P., 2006, "Robot Design and Testing for Greenhouse Applications," Biosyst. Eng., 95(3), pp. 309–321.
[6] Torii, T., 2000, "Research in Autonomous Agriculture Vehicles in Japan," Comput. Electron. Agric., 25(1–2), pp. 133–153.
[7] Astrand, B., and Baerveldt, A., 2005, "A Vision Based Row-Following System for Agricultural Field Machinery," Mechatronics, 15(2), pp. 251–269.
[8] Hamner, B., Singh, S., and Bergerman, M., 2011, "Improving Orchard Efficiency With Autonomous Utility Vehicles," American Society of Agricultural and Biological Engineers Annual International Meeting (ASABE), Pittsburgh, PA, June 20–23, Paper No. 1009415, Vol. 6, pp. 4670–4685.
[9] Sim, R., Elinas, P., Griffin, M., Shyr, A., and Little, J. J., 2006, "Design and Analysis of a Framework for Realtime Vision-Based SLAM Using Rao-Blackwellised Particle Filters," 3rd Canadian Conference on Computer and Robot Vision, Quebec, Canada, June 7–9, pp. 1–21.
[10] Sim, R., and Little, J. J., 2006, "Autonomous Vision-Based Exploration and Mapping Using Hybrid Maps and Rao-Blackwellised Particle Filters," IEEE International Conference on Intelligent Robots and Systems (IROS), Beijing, Oct. 9–15, pp. 2082–2089.
[11] Manduchi, R., Castano, A., Talukder, A., and Matthies, L., 2005, "Obstacle Detection and Terrain Classification for Autonomous Off-Road Navigation," Auton. Rob., 18(1), pp. 81–102.
[12] Sofman, B., Lin, E., Bagnell, J., Cole, J., Vandapel, N., and Stentz, A., 2006, "Improving Robot Navigation Through Self-Supervised Online Learning," J. Field Rob., 23(11–12), pp. 1059–1075.
[13] Wellington, C., Courville, A., and Stentz, A., 2006, "A Generative Model of Terrain for Autonomous Navigation in Vegetation," Int. J. Rob. Res., 25(12), pp. 1287–1304.
[14] Hamner, B., Singh, S., Roth, S., and Takahashi, T., 2008, "An Efficient System for Combined Route Traversal and Collision Avoidance," Auton. Rob., 24(4), pp. 365–385.
[15] Peynot, T., Underwood, J., and Scheding, S., 2009, "Towards Reliable Perception for Unmanned Ground Vehicles in Challenging Conditions," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2009), St. Louis, MO, Oct. 10–15, pp. 1170–1176.
[16] Ordonez, C., Chuy, O. Y., Collins, E. G., and Liu, X., 2011, "Laser-Based Rut Detection and Following System for Autonomous Ground Vehicles," J. Field Rob., 28(2), pp. 158–179.
[17] Zou, A. M., Hou, Z. G., Fu, S. Y., and Tan, M., 2006, "Neural Networks for Mobile Robot Navigation: A Survey," Advances in Neural Networks (Lecture Notes in Computer Science), Vol. 3972, Springer, Berlin, pp. 1218–1226.
[18] Kulkarni, A., and Tesar, D., 2010, "Instant Center Based Kinematic Formulation for Planar Wheeled Platforms," ASME J. Mech. Rob., 2(3), p. 031015.
[19] Udengaard, M., and Iagnemma, K., 2009, "Analysis, Design, and Control of an Omnidirectional Mobile Robot in Rough Terrain," ASME J. Mech. Des., 131(12), p. 121002.
[20] Chakraborty, N., and Ghosal, A., 2005, "Dynamic Modeling and Simulation of a Wheeled Mobile Robot for Traversing Uneven Terrain Without Slip," ASME J. Mech. Des., 127(5), pp. 901–909.
[21] Witus, G., Karlsen, R., Gorsich, D., and Gerhart, G., 2001, "Preliminary Investigation Into the Use of Stereo Illumination to Enhance Mobile Robot Terrain Perception," Proc. SPIE, 4364, pp. 290–301.
[22] Matthies, L., 2003, "Negative Obstacle Detection by Thermal Signature," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), Las Vegas, NV, Oct. 27–31, Vol. 1, pp. 906–913.
[23] Kim, Y., and Minor, M. A., 2007, "Path Manifold-Based Kinematic Control of Wheeled Mobile Robots Considering Physical Constraints," Int. J. Rob. Res., 26(9), pp. 955–975.
[24] Flickinger, D. M., and Minor, M. A., 2007, "Remote Low Frequency State Feedback Kinematic Motion Control for Mobile Robot Trajectory Tracking," IEEE International Conference on Robotics and Automation (ICRA), Rome, Apr. 10–14, pp. 3502–3507.
[25] Barawid, O., Mizushima, A., Ishii, K., and Noguchi, N., 2007, "Development of an Autonomous Navigation System Using a Two-Dimensional Laser Scanner in an Orchard Application," Biosyst. Eng., 96(2), pp. 139–149.
[26] Heidari, F., and Fotouhi, R., 2013, "A Human-Inspired Method for Mobile Robot Navigation," ASME Paper No. DETC2013-13523.
[27] Amoozgar, M., Sadati, H., and Alipour, K., 2012, "Trajectory Tracking of Wheeled Mobile Robots Using a Kinematical Fuzzy Controller," Int. J. Rob. Autom., 27(1), pp. 49–59.
[28] Suo, T., Yang, S., and Anmin, Z., 2011, "A Novel GA-Based Fuzzy Controller for Mobile Robots in Dynamic Environments With Moving Obstacles," Int. J. Rob. Autom., 26(2), pp. 212–228.
[29] Payne, V., and Isaacs, L., 2012, Human Motor Development: A Lifespan Approach, 8th ed., McGraw-Hill, New York.
[30] Fajen, B. R., and Warren, W. H., 2003, "Behavioral Dynamics of Steering, Obstacle Avoidance, and Route Selection," J. Exp. Psychol.: Hum. Percept. Perform., 29(2), pp. 343–362.
[31] Frank, T. D., Gifford, T. D., and Chiangga, S., 2014, "Minimalistic Model for Navigation of Mobile Robots Around Obstacles Based on Complex-Number Calculus and Inspired by Human Navigation Behavior," Math. Comput. Simul., 97, pp. 108–122.
[32] Huang, W. H., Fajen, B. R., Fink, J. R., and Warren, W. H., 2006, "Visual Navigation and Obstacle Avoidance Using a Steering Potential Function," Rob. Auton. Syst., 54(4), pp. 288–299.
[33] Campus Farm Field, University of Saskatchewan, Saskatchewan, Canada.
[34] CNH Farm Field North of Saskatoon, Saskatchewan, Canada.

Figures

Fig. 1

The structure of the robot navigation algorithm

Fig. 2

Geometric configuration of the mobile robot

Fig. 3

General model of the robot for path following

Fig. 4

Polar (r–θ) representation of a line

Fig. 5

(a) A group of lines passing through a point (x0, y0); (b) each line can be represented by a pair (ri, θi), which becomes a sinusoidal curve in the ri–θi plane
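
For reference, the parameterization behind Figs. 4–6 is the standard normal (polar) form of a line used by the Hough transform:

```latex
% Normal (polar) form of a line: r is the perpendicular distance from the
% origin to the line, and \theta is the angle of the line's normal.
r = x\cos\theta + y\sin\theta
% A fixed point (x_0, y_0) therefore traces the sinusoidal curve
r(\theta) = x_0\cos\theta + y_0\sin\theta
% so collinear points produce sinusoids that all intersect at the line's
% parameters (r_0, \theta_0).
```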

Fig. 6

(a) Collinear points with normal parameterization (r0, θ0). (b) The collinear points are transformed into curves that intersect at a single point in the r–θ plane.

Fig. 7

(a) Original laser point clouds. (b) Hough transform of the points. (c) Line detection using the Hough transform algorithm. Detected lines are shown in light blue (lighter gray in print).
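
The row detection of Figs. 4–8 rests on the standard Hough transform. Below is a minimal sketch of Hough line detection on a 2D laser point cloud; the bin sizes and vote threshold are illustrative, not the paper's parameters.

```python
import numpy as np

# Hough transform for lines in a 2D point cloud: each point votes for all
# (r, theta) cells consistent with r = x cos(theta) + y sin(theta); cells
# with enough votes are reported as lines.
def hough_lines(points, r_res=0.05, theta_bins=180, min_votes=20):
    pts = np.asarray(points, dtype=float)             # (N, 2) x-y points
    thetas = np.linspace(0.0, np.pi, theta_bins, endpoint=False)
    r_all = pts @ np.stack([np.cos(thetas), np.sin(thetas)])  # (N, T)
    r_max = np.abs(r_all).max()
    n_r = int(round(2.0 * r_max / r_res)) + 1
    acc = np.zeros((n_r, theta_bins), dtype=int)      # vote accumulator
    r_idx = np.round((r_all + r_max) / r_res).astype(int)
    for j in range(theta_bins):                       # one vote per point per theta
        np.add.at(acc[:, j], r_idx[:, j], 1)
    rows, cols = np.nonzero(acc >= min_votes)
    # Near-duplicate peaks are possible; non-maximum suppression is omitted.
    return [(rows[k] * r_res - r_max, thetas[cols[k]]) for k in range(len(rows))]

# Example: noisy points on the line y = 1, i.e., r = 1 at theta = 90 deg
rng = np.random.default_rng(0)
pts = np.column_stack([np.linspace(0, 5, 50), 1.0 + 0.01 * rng.standard_normal(50)])
print(hough_lines(pts))
```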

Fig. 8

Row of bushes detected by the LRF, and lines (1) and (3) generated by the Hough transform

Fig. 10

2D point clouds acquired by the LRF in two example scenes: (left) tilting angle θ = 10 deg; (right) tilting angle θ = −15 deg. The point clouds are analyzed by the path planning and obstacle avoidance algorithms.

Fig. 11

Spherical mapping for laser point clouds
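
The authors' exact spherical mapping is not reproduced here; the sketch below shows one common way to map a tilting 2D LRF measurement (range, in-scan bearing, tilt angle) to 3D Cartesian coordinates. The frame conventions are assumptions for illustration.

```python
import math

# Map a tilting-LRF return to 3D: rho is the measured range, phi the
# bearing within the scan plane, and alpha the tilt angle about the
# sensor's lateral (y) axis; alpha > 0 pitches the scan plane downward
# in this convention.
def lrf_point_to_xyz(rho, phi, alpha):
    xs = rho * math.cos(phi)   # forward, in the untilted scan plane
    ys = rho * math.sin(phi)   # left
    x = xs * math.cos(alpha)   # rotate the scan plane about the y axis
    z = -xs * math.sin(alpha)
    return x, ys, z

# Example: a 5 m return straight ahead with the scanner pitched 10 deg down
print(lrf_point_to_xyz(5.0, 0.0, math.radians(10)))  # ~ (4.92, 0.0, -0.87)
```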

Fig. 12

(a) A sample scene. (b) Section A-A view of the obstacle map generated from the laser data (point cloud) for the scene in (a).

Fig. 13

A local map of the environment generated using laser data from Fig. 12
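
As a rough illustration of Figs. 12 and 13, the sketch below accumulates already-classified obstacle points into a robot-centered 2D grid map. The cell size and map extent are assumptions.

```python
import numpy as np

# Build a robot-centered 2D occupancy grid from obstacle points (x, y in
# meters, robot at the map center). Illustrative parameters only.
def build_local_map(obstacle_xy, size_m=20.0, cell_m=0.2):
    n = int(size_m / cell_m)
    grid = np.zeros((n, n), dtype=np.uint8)   # 0 = free/unknown, 1 = obstacle
    half = size_m / 2.0
    for x, y in obstacle_xy:
        i, j = int((x + half) / cell_m), int((y + half) / cell_m)
        if 0 <= i < n and 0 <= j < n:
            grid[i, j] = 1
    return grid

# Example: two nearby obstacle returns ahead-left of the robot
print(build_local_map([(2.0, 0.5), (2.2, 0.6)]).sum())  # number of occupied cells
```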

Fig. 14

Terrain slope definition

Fig. 15

Estimating the normal vector at point P
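
A standard way to estimate the normal at point P (Fig. 15) is to fit a plane to P's neighboring laser points: the normal is the eigenvector of the neighborhood covariance with the smallest eigenvalue. The sketch below uses that generic technique, which may differ in detail from the paper's estimator.

```python
import numpy as np

# Estimate the surface normal at a point from its k neighboring 3D points
# by plane fitting (smallest-eigenvalue direction of the covariance).
def estimate_normal(neighbors):
    pts = np.asarray(neighbors, dtype=float)      # (k, 3) points around P
    centered = pts - pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(centered.T @ centered)  # ascending
    n = eigvecs[:, 0]                             # least-variance direction
    return n if n[2] >= 0 else -n                 # orient upward (+z)

# Example: points on the plane z = 0.2 x -> normal ~ (-0.196, 0, 0.981)
patch = [(0, 0, 0), (1, 0, 0.2), (0, 1, 0), (1, 1, 0.2), (0.5, 0.5, 0.1)]
print(estimate_normal(patch))
```

The terrain slope of Fig. 14 then follows as the angle between this unit normal n and the vertical, e.g., math.degrees(math.acos(n[2])).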

Fig. 16

An example of traversable region modeling from the laser scanner data: traversable paths are depicted as dashed lines, and the width of the AGV is shown along these paths. The axes are in meters.

Fig. 17

Fuzzy membership functions for measured distance

Fig. 18

Fuzzy membership functions for change in the robot heading angle
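
Figures 17 and 18 show the paper's actual membership functions; the sketch below only illustrates the general mechanism with triangular memberships for a measured obstacle distance. The linguistic terms and breakpoints are assumptions.

```python
# Triangular membership with feet at a and c and peak at b.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzify a measured distance d (meters) into illustrative linguistic terms.
def fuzzify_distance(d):
    return {
        "near":   tri(d, -1.0, 0.0, 2.0),
        "medium": tri(d, 1.0, 3.0, 5.0),
        "far":    max(0.0, min(1.0, (d - 4.0) / 2.0)),  # open right shoulder
    }

print(fuzzify_distance(1.5))  # equally "near" and "medium" at 1.5 m
```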

Fig. 19

Obstacle avoidance behavior of HIM
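
The HIM draws on models of human steering behavior such as Fajen and Warren [30] and Huang et al. [32]. As background, the sketch below shows that class of steering dynamics: the heading is attracted to the goal direction and repelled by obstacle directions, with strengths that decay with distance. The constants are illustrative, and this is not the paper's own controller.

```python
import math

# Heading dynamics in the spirit of Fajen and Warren [30]: returns the
# angular acceleration of the heading phi given the goal direction
# psi_goal at distance d_goal and a list of (psi_o, d_o) obstacles.
def heading_accel(phi, phi_dot, psi_goal, d_goal, obstacles,
                  b=3.25, k_g=7.5, c1=0.4, c2=0.4,
                  k_o=198.0, c3=6.5, c4=0.8):
    acc = -b * phi_dot - k_g * (phi - psi_goal) * (math.exp(-c1 * d_goal) + c2)
    for psi_o, d_o in obstacles:   # each obstacle repels the heading
        acc += k_o * (phi - psi_o) * math.exp(-c3 * abs(phi - psi_o)) \
                                   * math.exp(-c4 * d_o)
    return acc

# Example: goal dead ahead at 10 m, one obstacle 5 deg to the left at 3 m
print(heading_accel(0.0, 0.0, 0.0, 10.0, [(math.radians(-5), 3.0)]))
```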

Fig. 20

Sample 2D laser scanner data from the environment. Dark dots are obstacles detected by the laser scanner.

Fig. 21

Different regions in the laser scanner view

Fig. 22

Scape-points are denoted Si; Li is the distance from scape-point Si to the robot's position.
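
The paper's rule for choosing among scape-points is not reproduced here; the sketch below shows one plausible scoring scheme that weighs the distance Li against deviation from the goal direction. The cost function and its weights are assumptions.

```python
import math

# Pick a scape-point S_i by minimizing an assumed cost that trades off the
# travel distance L_i against deviation from the goal bearing (radians).
def pick_scape_point(candidates, robot_xy, goal_bearing, w_dist=1.0, w_head=2.0):
    best, best_cost = None, float("inf")
    for sx, sy in candidates:
        L_i = math.hypot(sx - robot_xy[0], sy - robot_xy[1])
        bearing = math.atan2(sy - robot_xy[1], sx - robot_xy[0])
        dev = abs(math.atan2(math.sin(bearing - goal_bearing),
                             math.cos(bearing - goal_bearing)))
        cost = w_dist * L_i + w_head * dev
        if cost < best_cost:
            best, best_cost = (sx, sy), cost
    return best

# Example: of two candidates, the one nearer the goal direction wins
print(pick_scape_point([(3.0, 2.0), (3.0, -0.5)], (0.0, 0.0), 0.0))
```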

Fig. 23

Two typical performance characteristics of the HIM for mobile robot navigation

Fig. 24

Experimental outline: 4 × 4 differential drive Grizzly mobile robot (AGV), tilting LRF for obstacle detection, and a base RTK-DGPS for localization

Fig. 25

Snapshots of the robot during a typical experiment: the robot travels from the start point to the goal point using the HIM for obstacle avoidance. Left: robot at the start point; middle: robot at the midpoint; right: robot at the goal point in a campus farm field [33].

Fig. 26

Experimental results validating the navigation strategy for eight different setups (a)–(h). Solid line: robot's path using the HIM; dashed line: robot's path using the fuzzy-logic-based (FLB) approach.

Fig. 27

Snapshots of the robot run for the experiment in setup 8: the robot traverses from the start point to the goal point using the HIM for obstacle avoidance. Top-left: the robot at the start point; bottom-right: the robot at the goal. The image sequence proceeds to the right and down. Tests were performed at the site in Ref. [33].

Fig. 28

Lines corresponding to the rows detected by the navigation method using the Hough transform, tested at the site in Ref. [33]

Fig. 29

A typical experimental result for the row-detection and path-following scenario: the desired and actual paths of the robot are depicted by dashed and solid lines, respectively; bushes are shown as stars

Fig. 30

Experimental results obtained for scenario 1 (path following test)

Fig. 31

Experimental results obtained for scenario 2 (path following test)

Fig. 32

Experimental results obtained for scenario 3 (path following test)

Fig. 33

Experimental results obtained for scenario 4 (path following test)

Fig. 34

Snapshots of the robot following a path on a hill at the site in Ref. [34]. Top-left: the robot at the start point; bottom-right: the robot at the end point. The image sequence proceeds to the right and down.

Fig. 35

Experimental results obtained for scenario 5 (path following test)

Fig. 36

Snapshots of the robot following a path while avoiding obstacles, tested at the site in Ref. [33]

Fig. 37

Experimental results for the path-following scenario with both positive and negative obstacles in the robot's path. The positive obstacle measured 50 × 50 × 40 cm (length × width × height); the hole (negative obstacle) was 50 cm in both depth and diameter.
