Navigational Context Recognition for an Autonomous Robot in a Simulated Tree Plantation

Date
2006-01-01
Authors
Khot, Lav
Tang, Lie
Blackmore, Simon
Nørremark, Michael
Department
Agricultural and Biosystems Engineering
Abstract

A sensor fusion technique was developed for estimating the navigational posture of a skid-steered mobile robot in a simulated tree plantation nursery. Real-time kinematic GPS (RTK-GPS) and dynamic measurement unit (DMU) sensors were used to determine the position and orientation of the robot, while a laser range finder was used to locate the tree positions within a selected range. The RTK-GPS error was modeled by a second-order autoregressive model, and the error states were incorporated into the extended Kalman filter (EKF) design. Through EKF filtering, the mean and standard deviation of the error in the easting direction decreased from 4.05 to 2.21 cm and from 8.27 to 1.89 cm, respectively, while in the northing direction they decreased from 4.64 to 1.81 cm and from 11 to 2.16 cm, respectively. The geo-referenced tree positions along the navigational paths were also recovered using a K-means clustering algorithm, with an average tree position estimation error of 4.4 cm. The developed sensor fusion algorithm was shown to be capable of recognizing and reconstructing the navigational environment of a simulated tree plantation, which offers great potential for improving the applicability of autonomous robots operating in nursery tree plantations for tasks such as intra-row mechanical weeding.
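
The tree-recovery step described in the abstract can be illustrated with a minimal sketch: geo-referenced laser range-finder hits are grouped with K-means so that each cluster centroid approximates a trunk position. This is not the paper's implementation; the synthetic point data, the cluster count, and all variable names are assumptions made for demonstration.

```python
# Illustrative sketch only (not the paper's code): cluster geo-referenced laser
# hits into estimated tree positions with K-means. All data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical trunk centers (easting, northing) in meters along one nursery row.
true_centers = np.array([[0.0, 0.0], [1.5, 0.0], [3.0, 0.0], [4.5, 0.0]])

# Simulate geo-referenced laser hits scattered around each trunk (~2 cm noise).
hits = np.vstack([c + rng.normal(scale=0.02, size=(30, 2)) for c in true_centers])

# One cluster per expected tree; each centroid is the estimated trunk position.
kmeans = KMeans(n_clusters=len(true_centers), n_init=10, random_state=0).fit(hits)
estimated = kmeans.cluster_centers_

# Report the localization error of each estimated tree position, in centimeters.
for est in estimated:
    nearest = true_centers[np.argmin(np.linalg.norm(true_centers - est, axis=1))]
    print(f"estimated {est} -> error {np.linalg.norm(nearest - est) * 100:.1f} cm")
```

In practice the hit points would come from transforming laser ranges into the geodetic frame using the EKF pose estimate, and the number of clusters would be set from the expected tree count along the traversed row.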

Comments

This article is from Transactions of the ASABE 49, no. 5 (2006): 1579–1588.

Copyright
2006