Research Papers

Nonlinear Vision-Based Observer for Visual Servo Control of an Aerial Robot in Global Positioning System Denied Environments

Author and Article Information
Dejun Guo

Design, Automation, Robotics,
and Control (DARC) Lab,
Department of Mechanical Engineering,
Robotics Center,
University of Utah,
Salt Lake City, UT 84112
e-mail: dejun.guo@utah.edu

Hesheng Wang

Key Laboratory of System Control and
Information Processing,
Department of Automation,
Shanghai Jiao Tong University,
Shanghai 200240, China
e-mail: wanghesheng@sjtu.edu.cn

Kam K. Leang

Design, Automation, Robotics,
and Control (DARC) Lab,
Department of Mechanical Engineering,
Robotics Center,
University of Utah,
Salt Lake City, UT 84112
e-mail: kam.k.leang@utah.edu

¹Corresponding author.

Contributed by the Mechanisms and Robotics Committee of ASME for publication in the JOURNAL OF MECHANISMS AND ROBOTICS. Manuscript received May 7, 2018; final manuscript received August 30, 2018; published online October 18, 2018. Assoc. Editor: David J. Cappelleri.

J. Mechanisms Robotics 10(6), 061018 (Oct 18, 2018) (13 pages); Paper No. JMR-18-1131; doi: 10.1115/1.4041431. History: Received May 7, 2018; Revised August 30, 2018.

This paper presents a nonlinear vision-based observer that estimates the 3D translational position and velocity of a quadrotor aerial robot for closed-loop, position-based, visual-servo control in global positioning system (GPS)-denied environments. The method allows for motion control in areas where GPS signals are weak or absent, for example, inside a building. Herein, the robot uses a low-cost on-board camera to observe at least two feature points fixed in the world frame and self-localizes for feedback control. The nonlinear observer takes advantage of the geometry of the perspective projection and updates the translational position and velocity in real time by exploiting visual information together with measurements from an inertial measurement unit. A key advantage of the algorithm is that it requires no constraints or assumptions on the robot's altitude or on the initial estimation errors. Two new controllers based on the backstepping technique, which act on the estimator's output, are described and implemented for trajectory tracking. The Lyapunov method is used to show asymptotic stability of the closed-loop system. Simulation and experimental results from an indoor environment where GPS localization is unavailable demonstrate feasibility and validate the performance of the observer and control system for hovering and for tracking a circular trajectory defined in the world frame.
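
To make the estimation idea concrete, the following Python sketch shows a generic reprojection-error-driven position/velocity observer in the same spirit as the one summarized above. It is not the authors' specific update law; the pinhole model, the gradient-style correction, the gains k_p and k_v, and the assumption that the IMU specific force is expressed in the camera frame are all illustrative simplifications.

```python
import numpy as np

def project(p_cam, f=1.0):
    """Pinhole projection of a point expressed in the camera frame."""
    x, y, z = p_cam
    return np.array([f * x / z, f * y / z])

def interaction_matrix(p_cam, f=1.0):
    """Standard point-feature image Jacobian w.r.t. camera-frame translation."""
    x, y, z = p_cam
    u, v = f * x / z, f * y / z
    return np.array([[-f / z, 0.0, u / z],
                     [0.0, -f / z, v / z]])

def observer_step(p_hat, v_hat, R, a_spec, landmarks, pixels, dt,
                  k_p=2.0, k_v=4.0):
    """One Euler step of a reprojection-error-driven observer (illustrative gains).

    p_hat, v_hat : current world-frame position/velocity estimates
    R            : camera-to-world rotation (IMU attitude + known camera mounting)
    a_spec       : IMU specific force, here assumed expressed in the camera frame
    landmarks    : world-frame positions of the fixed feature points
    pixels       : their measured image coordinates
    """
    g = np.array([0.0, 0.0, -9.81])
    correction = np.zeros(3)
    for b_w, s_meas in zip(landmarks, pixels):
        p_cam = R.T @ (b_w - p_hat)        # predicted feature in the camera frame
        e = s_meas - project(p_cam)        # reprojection error
        L = interaction_matrix(p_cam)
        correction += R @ L.T @ e          # descent direction on ||e||^2 w.r.t. p_hat
    p_hat = p_hat + dt * (v_hat + k_p * correction)
    v_hat = v_hat + dt * (R @ a_spec + g + k_v * correction)
    return p_hat, v_hat
```

The prediction terms (velocity and gravity-compensated IMU acceleration) propagate the estimates between frames, while the vision-based correction drives the estimated reprojections toward the measured pixel locations.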

Copyright © 2018 by ASME

Figures

Fig. 1

Illustration showing a quadcopter aerial robot with a downward-facing camera. The robot uses the camera to monitor two static feature points, $b_1$ and $b_2$, on the ground surface for vision-based estimation of its translational position and velocity for closed-loop tracking control. The desired translational-position and yaw-angle trajectories are denoted by $P_d$ and $\psi_d$, respectively. Relevant coordinate frames are also shown.
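
The geometry in Fig. 1 is what makes the 3D position recoverable without an altitude assumption: with attitude known from the IMU, each feature point defines a bearing ray from the camera, and two such rays through known world points determine the camera center. The sketch below is the standard least-squares ray-intersection identity, not the paper's observer; the names and frame conventions are illustrative.

```python
import numpy as np

def localize_from_bearings(landmarks, bearings_cam, R):
    """Least-squares camera position from known feature points and bearing rays.

    landmarks    : world-frame feature positions (e.g., b1 and b2 on the ground)
    bearings_cam : unit vectors from the camera toward each feature (camera frame)
    R            : camera-to-world rotation, known from the IMU attitude
    """
    A = np.zeros((3, 3))
    rhs = np.zeros(3)
    for b, d in zip(landmarks, bearings_cam):
        u = R @ (np.asarray(d) / np.linalg.norm(d))  # bearing in the world frame
        P = np.eye(3) - np.outer(u, u)               # projector orthogonal to the ray
        A += P                                       # each ray: (I - u u^T)(p - b) = 0
        rhs += P @ np.asarray(b)
    return np.linalg.solve(A, rhs)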

Fig. 2

Control system block diagram showing the visual-servo system, which consists of an observer for estimating the 3D translational position and velocity of the robot and a controller that acts on the estimated states. The nonlinear closed-loop observer is built around the observer update law $\Delta$ (Eqs. (30) and (31)). The perspective projection model is denoted by $\Lambda$ and is given by Eqs. (9), (10), (16), and (17). The control signal can be the reference acceleration $a_r$ given by Eq. (37) or $\dot{a}_r + \mathrm{sk}(\dot{\psi}E_3)a_r$ given by Eq. (42).
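
Since Eqs. (37) and (42) are not reproduced in this excerpt, the sketch below only illustrates the structure of the outer loop in Fig. 2: a reference acceleration computed from the observer's estimates, mapped to thrust and a desired body axis by the usual quadrotor geometry. The PD-plus-feedforward form and the gains are assumptions standing in for the paper's backstepping laws.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity in the world frame

def reference_acceleration(p_hat, v_hat, p_d, v_d, a_d, kx=4.0, kv=3.0):
    """Outer-loop acceleration command acting on the observer's estimates.

    A PD-plus-feedforward stand-in for the paper's backstepping controller;
    (p_hat, v_hat) replace direct position/velocity measurements.
    """
    return a_d - kx * (p_hat - p_d) - kv * (v_hat - v_d)

def thrust_and_body_z(a_r, m=1.2):
    """Map the commanded acceleration to total thrust and a desired body z-axis,
    exploiting the fact that a quadrotor's thrust acts along its body z-axis."""
    f_world = m * (a_r - G)          # force the rotors must produce
    thrust = np.linalg.norm(f_world)
    z_b_des = f_world / thrust       # desired thrust-axis direction
    return thrust, z_b_des
```

The desired body z-axis, together with the commanded yaw $\psi_d$, fixes the reference attitude that an inner attitude loop would track.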

Fig. 3

The experimental setup, showing the robot hovering under the proposed control scheme in an enclosed flight volume while observing two feature points on the ground. A Vicon motion capture system records the translational position and velocity of the robot for evaluation only; no GPS was used to localize the robot for closed-loop control. The quadcopter is built around a DJI Matrice 100 platform and carries a custom-designed flight control system integrated with an Odroid XU4 single-board computer with wireless communication capabilities.

Fig. 4

Simulation (a1–a4) and experimental (b1–b4) results for hovering in place: (a1/b1) estimated translational position errors, (a2/b2) translational position error, (a3/b3) estimated velocity errors, and (a4/b4) top view of the trajectories in the world frame

Fig. 5

Simulation (a1–a4) and experimental (b1–b4) results for tracking a circular trajectory: (a1/b1) estimated translational position errors, (a2/b2) translational position error, (a3/b3) estimated velocity errors, and (a4/b4) top view of the trajectories in the world frame

Fig. 6

Experimental results showing the robot (a) starting on a platform, (b) taking off and hovering for a short period, (c) tracking a circular trajectory defined in the lab frame, and finally (d) landing on the platform using visual-servo control. The robot observes two feature points on the ground. A Vicon motion capture system records the translational position and velocity of the robot for evaluation; GPS was not used to localize the robot for closed-loop control.

