High-throughput image-based plant stand count estimation using convolutional neural networks

Date
2022-07-28
Authors
Khaki, Saeed
Pham, Hieu
Khalilzadeh, Zahra
Masoud, Arezoo
Safaei, Nima
Han, Ye
Kent, Wade
Wang, Lizhi
Publisher
PLOS
Department
Industrial and Manufacturing Systems Engineering
Abstract
The landscape of farming and plant breeding is rapidly transforming due to the complex requirements of our world. The explosion of collectible data has started a revolution in agriculture, to the point where innovation is a necessity. For a commercial organization, the accurate and efficient collection of information is necessary to ensure that optimal decisions are made at key points of the breeding cycle. In particular, recent technology has enabled organizations to capture in-field images of crops to record color, shape, chemical properties, and disease susceptibility. However, this new data stream necessitates advanced algorithms that can accurately identify phenotypic traits. This work advances the current literature by developing an innovative deep learning algorithm, named DeepStand, for image-based counting of corn stands at early phenological stages. The proposed method adopts a truncated VGG-16 network as a feature extractor backbone. We then combine multiple feature maps with different dimensions to ensure the network is robust against size variation. Our extensive computational experiments demonstrate that our DeepStand framework accurately identifies corn stands and outperforms other cutting-edge methods.
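To make the architectural idea concrete, the following is a minimal PyTorch sketch of a multi-scale counting network in the spirit of the abstract: a truncated VGG-16 backbone whose intermediate feature maps at different resolutions are fused before regressing an estimate. The layer cut points, the 128-channel lateral convolutions, and the density-map head (whose sum is taken as the count, a common choice in counting networks) are illustrative assumptions, not the published DeepStand configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16

class MultiScaleCounter(nn.Module):
    def __init__(self):
        super().__init__()
        features = vgg16(weights="IMAGENET1K_V1").features
        # Truncated VGG-16 backbone, split at three stages (assumed cut points).
        self.stage3 = features[:16]    # -> 256 channels, 1/4 input resolution
        self.stage4 = features[16:23]  # -> 512 channels, 1/8 input resolution
        self.stage5 = features[23:30]  # -> 512 channels, 1/16 input resolution
        # 1x1 convolutions to a common channel width before fusion (assumed).
        self.lat3 = nn.Conv2d(256, 128, kernel_size=1)
        self.lat4 = nn.Conv2d(512, 128, kernel_size=1)
        self.lat5 = nn.Conv2d(512, 128, kernel_size=1)
        # Head regresses a single-channel density map (assumed output form).
        self.head = nn.Conv2d(3 * 128, 1, kernel_size=1)

    def forward(self, x):
        c3 = self.stage3(x)
        c4 = self.stage4(c3)
        c5 = self.stage5(c4)
        size = c3.shape[-2:]
        # Upsample the deeper, coarser maps so features at all scales
        # can be concatenated, making the network robust to size variation.
        fused = torch.cat([
            self.lat3(c3),
            F.interpolate(self.lat4(c4), size=size, mode="bilinear", align_corners=False),
            F.interpolate(self.lat5(c5), size=size, mode="bilinear", align_corners=False),
        ], dim=1)
        return F.relu(self.head(fused))  # non-negative density map

model = MultiScaleCounter().eval()
with torch.no_grad():
    density = model(torch.randn(1, 3, 384, 384))
    count = density.sum().item()  # estimated plant stand count for the image

Summing the density map turns localization into counting, which is why this family of networks tolerates plants of varying apparent size better than detection-based counters.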
Comments
This article is published as Khaki, Saeed, Hieu Pham, Zahra Khalilzadeh, Arezoo Masoud, Nima Safaei, Ye Han, Wade Kent, and Lizhi Wang. "High-throughput image-based plant stand count estimation using convolutional neural networks." PLoS ONE 17, no. 7 (2022): e0268762. DOI: 10.1371/journal.pone.0268762. Copyright 2022 Khaki et al. Attribution 4.0 International (CC BY 4.0). Posted with permission.