Show simple item record

dc.contributor.author	Parr, B	en_US
dc.contributor.author	Legg, M	en_US
dc.contributor.author	Alam, F	en_US
dc.coverage.spatial	Switzerland	en_US
dc.date.available	2022-05-31	en_US
dc.date.available	2022-05-30	en_US
dc.date.issued	2022-05-31	en_US
dc.identifier	https://www.ncbi.nlm.nih.gov/pubmed/35684799	en_US
dc.identifier	s22114179	en_US
dc.identifier.citation	Sensors (Basel), 2022, 22 (11)	en_US
dc.description.abstract	This work investigates the performance of five depth cameras in relation to their potential for grape yield estimation. The technologies used by these cameras include structured light (Kinect V1), active infrared stereoscopy (RealSense D415), time of flight (Kinect V2 and Kinect Azure), and LiDAR (Intel L515). To evaluate their suitability for grape yield estimation, a range of factors were investigated including their performance in and out of direct sunlight, their ability to accurately measure the shape of the grapes, and their potential to facilitate counting and sizing of individual berries. The depth cameras' performance was benchmarked using high-resolution photogrammetry scans. All the cameras except the Kinect V1 were able to operate in direct sunlight. Indoors, the RealSense D415 camera provided the most accurate depth scans of grape bunches, with a 2 mm average depth error relative to photogrammetric scans. However, its performance was reduced in direct sunlight. The time of flight and LiDAR cameras provided depth scans of grapes that had about an 8 mm depth bias. Furthermore, the individual berries manifested in the scans as pointed shape distortions. This led to an underestimation of berry sizes when applying the RANSAC sphere fitting but may help with the detection of individual berries with more advanced algorithms. Applying an opaque coating to the surface of the grapes reduced the observed distance bias and shape distortion. This indicated that these are likely caused by the cameras' transmitted light experiencing diffused scattering within the grapes. More work is needed to investigate if this distortion can be used for enhanced measurement of grape properties such as ripeness and berry size.	en_US
dc.language	eng	en_US
dc.relation.uri	https://www.mdpi.com/1424-8220/22/11/4179	en_US
dc.rights	(CC BY)	en_US
dc.subject	RGB-D	en_US
dc.subject	depth cameras	en_US
dc.subject	grapes	en_US
dc.subject	yield estimation	en_US
dc.subject	Algorithms	en_US
dc.subject	Fruit	en_US
dc.subject	Vitis	en_US
dc.title	Analysis of Depth Cameras for Proximal Sensing of Grapes.	en_US
dc.type	Journal Article
dc.citation.volume	22	en_US
dc.identifier.doi	10.3390/s22114179	en_US
dc.identifier.elements-id	453512
dc.relation.isPartOf	Sensors (Basel)	en_US
dc.citation.issue	11	en_US
dc.identifier.eissn	1424-8220	en_US
dc.description.publication-status	Published online	en_US
pubs.organisational-group	/Massey University
pubs.organisational-group	/Massey University/College of Sciences
pubs.organisational-group	/Massey University/College of Sciences/School of Food and Advanced Technology
dc.identifier.harvested	Massey_Dark
pubs.notes	Not known	en_US
dc.subject.anzsrc	0301 Analytical Chemistry	en_US
dc.subject.anzsrc	0805 Distributed Computing	en_US
dc.subject.anzsrc	0906 Electrical and Electronic Engineering	en_US
dc.subject.anzsrc	0502 Environmental Science and Management	en_US
dc.subject.anzsrc	0602 Ecology	en_US
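The abstract notes that berry sizes were underestimated "when applying the RANSAC sphere fitting" to the distorted depth scans. For readers unfamiliar with that step, the following is a minimal illustrative sketch of RANSAC sphere fitting in Python/NumPy — an assumption-laden reconstruction of the general technique, not the authors' code; all function names and parameter values (iteration count, inlier tolerance) are hypothetical, and the point cloud is assumed to be in metres:

```python
import numpy as np

def fit_sphere(pts):
    """Least-squares sphere through >= 4 points.

    Solves x^2 + y^2 + z^2 + D*x + E*y + F*z + G = 0 for (D, E, F, G),
    then recovers center = -(D, E, F)/2 and radius from G.
    """
    A = np.column_stack([pts, np.ones(len(pts))])
    b = -(pts ** 2).sum(axis=1)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = -coef[:3] / 2.0
    radius = np.sqrt(center @ center - coef[3])
    return center, radius

def ransac_sphere(pts, n_iters=500, tol=0.001, rng=None):
    """RANSAC sphere fit: repeatedly fit a sphere to 4 random points and
    keep the model with the most inliers (points whose distance to the
    sphere surface is below `tol`), then refit on those inliers.
    """
    rng = np.random.default_rng(rng)
    best_center, best_radius, best_count = None, None, -1
    for _ in range(n_iters):
        sample = pts[rng.choice(len(pts), 4, replace=False)]
        center, radius = fit_sphere(sample)
        # residual = |distance to center - radius| for every point
        resid = np.abs(np.linalg.norm(pts - center, axis=1) - radius)
        n_in = int((resid < tol).sum())
        if n_in > best_count:
            best_center, best_radius, best_count = center, radius, n_in
    # final refinement on the inliers of the best model
    resid = np.abs(np.linalg.norm(pts - best_center, axis=1) - best_radius)
    return fit_sphere(pts[resid < tol])
```

A pointed distortion of the kind the paper describes pushes surface points inward near the berry apex, so the consensus sphere shrinks — which is consistent with the reported size underestimation.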

