Grapevine buds detection and localization in 3D space based on Structure from Motion and 2D image classification

Title: Grapevine buds detection and localization in 3D space based on Structure from Motion and 2D image classification
Publication Type: Journal Article
Year of Publication: 2018
Authors: Diaz, C. Ariel; Pérez, D. Sebastián; Miatello, H.; Bromberg, F.
Journal: Computers in Industry
Volume: 99C
Issue: Special Issue on Machine Vision for Outdoor Environments
Start Page: 303
Pagination: 303-312
Date Published: 04/2018
ISSN: 0166-3615
Keywords: computer vision, Grapevine bud detection, Precision viticulture
Abstract

In viticulture, there are several applications where 3D bud detection and localization in vineyards is a necessary task susceptible to automation: measurement of sunlight exposure, autonomous pruning, bud counting, type-of-bud classification, bud geometric characterization, internode length measurement, and assessment of bud development stage. This paper presents a workflow for obtaining quality 3D localizations of grapevine buds, based on well-known computer vision and machine learning algorithms, from images captured in natural field conditions (i.e., natural sunlight and no added artificial elements) during the winter season with a mobile phone RGB camera. Our pipeline combines Oriented FAST and Rotated BRIEF (ORB) for keypoint detection, DAISY (a fast local descriptor for dense matching) for keypoint description, and the Fast Library for Approximate Nearest Neighbors (FLANN) for keypoint matching, with a Structure from Motion multi-view scheme for generating consistent 3D point clouds. Next, a 2D scanning-window classifier based on Bag of Features and Support Vector Machines classifies the 3D points of the cloud. Finally, Density-Based Spatial Clustering of Applications with Noise (DBSCAN) is applied for 3D bud localization. Our approach achieved a maximum precision of 1.0 (i.e., no false detections), a maximum recall of 0.45 (i.e., 45% of the buds detected), and a localization error within the range of 259-554 pixels (approximately 3 bud diameters, or 1.5 cm) when evaluated over the whole range of user-given parameters of the workflow components.
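
The keypoint detection, description, and matching stages summarized above could be sketched roughly as follows with OpenCV. This is a hypothetical illustration, not the authors' implementation: the function name match_pair and the parameter values are assumptions, and opencv-contrib-python is assumed to be available for the DAISY descriptor.

```python
# Hypothetical sketch of the feature stage feeding Structure from Motion:
# ORB keypoints, DAISY descriptors, and FLANN matching for one image pair.
# Requires opencv-contrib-python for cv2.xfeatures2d.DAISY_create().
import cv2

def match_pair(path_a, path_b, n_keypoints=5000, ratio=0.7):
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

    # Detect keypoints with ORB (Oriented FAST), then describe them with DAISY.
    orb = cv2.ORB_create(nfeatures=n_keypoints)
    daisy = cv2.xfeatures2d.DAISY_create()
    kps_a, desc_a = daisy.compute(img_a, orb.detect(img_a, None))
    kps_b, desc_b = daisy.compute(img_b, orb.detect(img_b, None))

    # FLANN matching (KD-tree index over DAISY's float descriptors),
    # followed by Lowe's ratio test to discard ambiguous matches.
    flann = cv2.FlannBasedMatcher({"algorithm": 1, "trees": 5}, {"checks": 50})
    matches = flann.knnMatch(desc_a, desc_b, k=2)
    good = []
    for pair in matches:
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return kps_a, kps_b, good
```

The retained matches for every image pair would then be passed to a Structure from Motion reconstruction to produce the 3D point cloud described in the abstract.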

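Similarly, the Bag of Features + SVM classification and the DBSCAN-based localization stages could be approximated with scikit-learn along the following lines. This is again a hedged sketch under assumed interfaces: the vocabulary size, SVM kernel, eps, and min_samples values are placeholders, not the parameter settings evaluated in the paper.

```python
# Hypothetical sketch of the later stages: a Bag-of-Features + SVM patch
# classifier and DBSCAN clustering of the 3D points classified as "bud".
import numpy as np
from sklearn.cluster import DBSCAN, KMeans
from sklearn.svm import SVC

def bof_histogram(descriptors, vocabulary):
    # Quantize a patch's local descriptors against the visual vocabulary
    # and return a normalized histogram of visual-word occurrences.
    words = vocabulary.predict(descriptors)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def train_bud_classifier(patch_descriptors, patch_labels, vocab_size=350):
    # patch_descriptors: list of (n_i, d) arrays of local descriptors, one per
    # training patch; patch_labels: 1 for bud, 0 for non-bud.
    vocabulary = KMeans(n_clusters=vocab_size, n_init=10)
    vocabulary.fit(np.vstack(patch_descriptors))
    histograms = np.array([bof_histogram(d, vocabulary) for d in patch_descriptors])
    classifier = SVC(kernel="rbf").fit(histograms, patch_labels)
    return vocabulary, classifier

def localize_buds(points_3d, bud_mask, eps=0.01, min_samples=5):
    # Cluster the 3D points classified as bud; each cluster centroid is
    # reported as one detected bud location.
    bud_points = points_3d[bud_mask]
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(bud_points)
    return [bud_points[labels == k].mean(axis=0) for k in set(labels) if k != -1]
```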
URL: https://www.sciencedirect.com/science/article/pii/S0166361517304815
DOI: 10.1016/j.compind.2018.03.033
DHARMa members who are authors: 
Peer reviewed?: 1
International?: 1