Automated crop plant detection based on the fusion of color and depth images for robotic weed control

dc.contributor.author Tang, Lie
dc.contributor.author Gai, Jingyao
dc.contributor.author Steward, Brian
dc.contributor.department Agricultural and Biosystems Engineering
dc.contributor.department Human Computer Interaction
dc.contributor.department Plant Sciences Institute
dc.date 2019-09-21T00:04:55.000
dc.date.accessioned 2020-06-29T22:36:37Z
dc.date.available 2020-06-29T22:36:37Z
dc.date.copyright Tue Jan 01 00:00:00 UTC 2019
dc.date.embargo 2020-07-23
dc.date.issued 2019-07-23
dc.description.abstract <p>Robotic weeding enables weed control near or within crop rows automatically, precisely, and effectively. A computer‐vision system was developed for detecting crop plants at different growth stages for robotic weed control. Fusion of color images and depth images was investigated as a means of enhancing the detection accuracy of crop plants under conditions of high weed population. In‐field images of broccoli and lettuce were acquired 3–27 days after transplanting with a Kinect v2 sensor. The image processing pipeline included data preprocessing, vegetation pixel segmentation, plant extraction, feature extraction, feature‐based localization refinement, and crop plant classification. For the detection of broccoli and lettuce, the color‐depth fusion algorithm produced high true‐positive detection rates (91.7% and 90.8%, respectively) and low average false discovery rates (1.1% and 4.0%, respectively). Mean absolute localization errors of the crop plant stems were 26.8 and 7.4 mm for broccoli and lettuce, respectively. The fusion of color and depth proved beneficial for segmenting crop plants from the background, improving the average segmentation success rates from 87.2% (depth‐based) and 76.4% (color‐based) to 96.6% for broccoli, and from 74.2% (depth‐based) and 81.2% (color‐based) to 92.4% for lettuce. The fusion‐based algorithm had reduced performance in detecting crop plants at early growth stages.</p>
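The vegetation-segmentation step the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a standard excess-green color index combined with a simple above-ground depth test, and the function name and all thresholds (`exg_thresh`, `ground_depth`, `plant_height`) are hypothetical placeholders, not the authors' calibrated values.

```python
import numpy as np

def segment_vegetation(rgb, depth, exg_thresh=20.0,
                       ground_depth=1.5, plant_height=0.05):
    """Fuse a color cue and a depth cue into one vegetation mask.

    rgb:   (H, W, 3) uint8 color image
    depth: (H, W) float distances from the sensor, in meters
    All thresholds are illustrative, not values from the paper.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    exg = 2.0 * g - r - b                    # excess-green index: large for green pixels
    color_mask = exg > exg_thresh            # color cue: pixel looks like vegetation
    depth_mask = depth < (ground_depth - plant_height)  # depth cue: pixel sits above ground
    return color_mask & depth_mask           # fusion: keep pixels that pass both cues
```

Fusing the two cues is what suppresses the failure modes of each alone: green weeds at ground level can be rejected by height, and elevated non-green clutter can be rejected by color, which is consistent with the segmentation-rate gains the abstract reports.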
dc.description.comments <p>This is the peer-reviewed version of the following article: Gai, Jingyao, Lie Tang, and Brian L. Steward. "Automated crop plant detection based on the fusion of color and depth images for robotic weed control." <em>Journal of Field Robotics</em> (2019), which has been published in final form at DOI: <a href="http://dx.doi.org/10.1002/rob.21897" target="_blank">10.1002/rob.21897</a>. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.</p>
dc.format.mimetype application/pdf
dc.identifier archive/lib.dr.iastate.edu/abe_eng_pubs/1065/
dc.identifier.articleid 2349
dc.identifier.contextkey 15362855
dc.identifier.s3bucket isulib-bepress-aws-west
dc.identifier.submissionpath abe_eng_pubs/1065
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/768
dc.language.iso en
dc.source.bitstream archive/lib.dr.iastate.edu/abe_eng_pubs/1065/2019_TangLie_AutomatedCrop.pdf|||Fri Jan 14 18:25:24 UTC 2022
dc.source.uri 10.1002/rob.21897
dc.subject.disciplines Agriculture
dc.subject.disciplines Bioresource and Agricultural Engineering
dc.subject.keywords computer vision
dc.subject.keywords crop detection
dc.subject.keywords robotic weeding
dc.subject.keywords sensor fusion
dc.title Automated crop plant detection based on the fusion of color and depth images for robotic weed control
dc.type article
dc.type.genre article
dspace.entity.type Publication
relation.isAuthorOfPublication e60e10a5-8712-462a-be4b-f486a3461aea
relation.isAuthorOfPublication ef71fa01-eb3e-4e29-ade7-bcb38f2968b0
relation.isOrgUnitOfPublication 8eb24241-0d92-4baf-ae75-08f716d30801
File
Original bundle (showing 1 of 1)
Name: 2019_TangLie_AutomatedCrop.pdf
Size: 2.92 MB
Format: Adobe Portable Document Format