Grapevine buds detection and localization in 3D space based on Structure from Motion and 2D image classification
Title | Grapevine buds detection and localization in 3D space based on Structure from Motion and 2D image classification |
Publication Type | Journal Article |
Year of Publication | 2018 |
Authors | Díaz C. Ariel, Pérez D. Sebastián, Miatello H., Bromberg F.
Journal | Computers in Industry |
Volume | 99
Issue | Special Issue on Machine Vision for Outdoor Environments |
Start Page | 303 |
Pagination | 303-312 |
Date Published | 04/2018 |
ISSN | 0166-3615 |
Keywords | Computer vision, Grapevine bud detection, Precision viticulture
Abstract | In viticulture, there are several applications where 3D bud detection and localization in vineyards is a necessary task susceptible to automation: measurement of sunlight exposure, autonomous pruning, bud counting, type-of-bud classification, bud geometric characterization, internode length, and bud development stage. This paper presents a workflow to achieve quality 3D localizations of grapevine buds based on well-known computer vision and machine learning algorithms when provided with images captured in natural field conditions (i.e., natural sunlight and no artificial elements added), during the winter season and using a mobile phone RGB camera. Our pipeline combines Oriented FAST and Rotated BRIEF (ORB) for keypoint detection, the Fast Local Descriptor for Dense Matching (DAISY) for describing the keypoints, and the Fast Library for Approximate Nearest Neighbors (FLANN) technique for matching keypoints, with the Structure from Motion multi-view scheme for generating consistent 3D point clouds. Next, it uses a 2D scanning window classifier based on Bag of Features and Support Vector Machines for classification of the 3D points in the cloud. Finally, Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is applied for 3D bud localization. Our approach resulted in a maximum precision of 1.0 (i.e., no false detections), a maximum recall of 0.45 (i.e., 45% of the buds detected), and a localization error within the range of 259-554 pixels (corresponding to approximately 3 bud diameters, or 1.5 cm) when evaluated over the whole range of user-given parameters of the workflow components.
URL | https://www.sciencedirect.com/science/article/pii/S0166361517304815 |
DOI | 10.1016/j.compind.2018.03.033 |
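The abstract describes a matching front end of ORB keypoints, DAISY descriptors, and FLANN matching that feeds a Structure from Motion reconstruction. The sketch below, in Python with OpenCV (the contrib build, which provides DAISY), shows one plausible way to wire those three pieces together for a single image pair; the function name `match_pair` and the parameter values (`nfeatures`, `ratio`, the KD-tree settings) are illustrative assumptions, not the paper's settings.

```python
import cv2

def match_pair(img_a, img_b, ratio=0.75):
    """Detect ORB keypoints, describe them with DAISY, and match them with FLANN.

    img_a, img_b: grayscale uint8 images (e.g. cv2.imread(path, cv2.IMREAD_GRAYSCALE)).
    """
    orb = cv2.ORB_create(nfeatures=5000)       # keypoint detector only (assumed count)
    daisy = cv2.xfeatures2d.DAISY_create()     # floating-point descriptor (opencv-contrib)

    kp_a = orb.detect(img_a, None)
    kp_b = orb.detect(img_b, None)
    kp_a, des_a = daisy.compute(img_a, kp_a)
    kp_b, des_b = daisy.compute(img_b, kp_b)

    # FLANN with a KD-tree index, suitable for DAISY's float descriptors.
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    matches = flann.knnMatch(des_a, des_b, k=2)

    # Lowe's ratio test keeps only distinctive correspondences.
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return kp_a, kp_b, good
```

The resulting correspondences would then be passed to the Structure from Motion stage to triangulate the consistent 3D point cloud mentioned in the abstract.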
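The 2D scanning-window classifier is described as Bag of Features plus a Support Vector Machine. Below is a minimal Bag-of-Features sketch assuming a scikit-learn back end: a k-means visual vocabulary is built from pooled local descriptors, each window patch is encoded as a normalized word histogram, and an SVM is trained on those histograms. The vocabulary size, the helper names, and the choice of scikit-learn are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def build_vocabulary(descriptor_list, n_words=100):
    """Cluster pooled local descriptors (one array per training patch) into visual words."""
    stacked = np.vstack(descriptor_list).astype(np.float32)
    return KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(stacked)

def encode_patch(des, vocabulary):
    """Bag-of-Features histogram for the descriptors of one scanning-window patch."""
    hist = np.zeros(vocabulary.n_clusters, dtype=np.float32)
    if des is not None and len(des) > 0:
        words = vocabulary.predict(des.astype(np.float32))
        for w in words:
            hist[w] += 1.0
        hist /= hist.sum()
    return hist

def train_bud_classifier(train_descriptors, labels, vocabulary):
    """Fit an SVM on histogram encodings; labels are 1 = bud patch, 0 = background."""
    X = np.array([encode_patch(des, vocabulary) for des in train_descriptors])
    return SVC(kernel="rbf", probability=True).fit(X, labels)
```

At inference time each scanning window is encoded the same way, and the SVM's decision can be propagated to the 3D points whose projections fall inside that window.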
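Finally, DBSCAN groups the 3D points classified as bud into individual bud detections. A minimal sketch with scikit-learn follows; the `eps` and `min_samples` values and the one-centroid-per-cluster readout are assumptions about how a single 3D localization per bud could be reported.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def localize_buds(bud_points, eps=0.01, min_samples=10):
    """Cluster 3D points labeled as bud and return one centroid per detected bud.

    bud_points: (N, 3) array of 3D coordinates that the 2D classifier voted as bud.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(bud_points)
    centroids = [bud_points[labels == lbl].mean(axis=0)
                 for lbl in sorted(set(labels)) if lbl != -1]   # label -1 marks noise
    return np.asarray(centroids)
```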