Detecting Tomato Flowers in Greenhouses Using Computer Vision
Authors: Dor Oppenheim, Yael Edan, Guy Shani
Abstract:
This paper presents an image analysis algorithm to detect and count yellow tomato flowers in a greenhouse with uneven illumination, complex growth conditions, and different flower sizes. The algorithm is designed to be employed on a drone that flies in greenhouses to accomplish tasks such as pollination and yield estimation. Detecting the flowers can provide useful information for the farmer, such as the number of flowers in a row and the number of flowers that were pollinated since the last visit to the row. The algorithm is designed to handle the real-world difficulties of a greenhouse, which include varying lighting conditions, shadowing, and occlusion, while respecting the computational limitations of the drone's simple processor. The algorithm identifies flowers using an adaptive global threshold, segmentation over the HSV color space, and morphological cues. The adaptive threshold classifies each image as darker or lighter; segmentation on hue, saturation, and value is then performed accordingly, and classification is done according to the size and location of the flowers. 1069 images of greenhouse tomato flowers were acquired in a commercial greenhouse in Israel using two different RGB cameras, an LG G4 smartphone and a Canon PowerShot A590. The images were acquired from multiple angles and distances and were sampled manually at various times of day to obtain varying lighting conditions. Ground truth was created by manually tagging approximately 25,000 individual flowers in the images. Sensitivity analyses were performed on the acquisition angle, time of day, camera, and thresholding type. Precision, recall, and their derived F1 score were calculated. Results indicate better performance for the view angle facing the flowers than for any other angle. Acquiring images in the afternoon yielded the best precision and recall.
Applying a global adaptive threshold improved the median F1 score by 3%. Results showed no difference between the two cameras used. Using hue values of 0.12-0.18 in the segmentation process provided the best precision, recall, and F1 score. With these values, the average precision and recall over all images were 74% and 75% respectively, with an F1 score of 0.73. Further analysis showed a 5% increase in precision and recall when analyzing images acquired in the afternoon and from the front viewpoint.
Keywords: Agricultural engineering, computer vision, image processing, flower detection.
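The pipeline described in the abstract — an adaptive global brightness threshold followed by HSV segmentation with a hue window of 0.12-0.18, then metric computation — can be sketched as follows. This is not the authors' code: the hue window and the F1 formula come from the abstract, while the saturation/value cutoffs, the brightness split point, and all function names are illustrative assumptions.

```python
# Illustrative sketch of the abstract's pipeline; not the authors' implementation.
# Hue window 0.12-0.18 is from the paper; saturation/value cutoffs and the
# brightness split are assumed values for demonstration only.
import colorsys

HUE_MIN, HUE_MAX = 0.12, 0.18  # yellow band reported in the abstract


def mean_brightness(pixels):
    """Global cue for the adaptive threshold: average V channel over the image."""
    return sum(max(p) / 255.0 for p in pixels) / len(pixels)


def is_flower_pixel(rgb, dark_image):
    """HSV test for a single pixel; looser saturation cutoff in darker images."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s_min = 0.25 if dark_image else 0.40  # assumed cutoffs
    return HUE_MIN <= h <= HUE_MAX and s >= s_min and v >= 0.30


def segment(pixels, brightness_split=0.5):
    """Classify the image as darker/lighter, then segment each pixel."""
    dark = mean_brightness(pixels) < brightness_split
    return [is_flower_pixel(p, dark) for p in pixels]


def f1_score(precision, recall):
    """Harmonic mean of precision and recall, as used in the evaluation."""
    return 2 * precision * recall / (precision + recall)
```

For example, a saturated yellow pixel such as (230, 220, 40) has a hue of about 0.16 and falls inside the window, while green foliage and gray background pixels do not. Note that the reported per-image averages (74% precision, 75% recall) and the reported F1 of 0.73 need not satisfy the formula exactly, since the paper's F1 is aggregated over images.

```python
pixels = [(230, 220, 40), (40, 160, 40), (200, 200, 200)]
print(segment(pixels))          # only the yellow pixel is flagged
print(f1_score(0.74, 0.75))     # about 0.745
```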
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1128833
References:
[1] K. Kapach, E. Barnea, R. Mairon, Y. Edan, and O. Ben-Shahar, “Computer vision for fruit harvesting robots state of the art and challenges ahead,” Int. J. Comput. Vis. Robot., vol. 3, no. 1/2, p. 4, 2012.
[2] S. A. Cameron, J. D. Lozier, J. P. Strange, J. B. Koch, N. Cordes, L. F. Solter, and T. L. Griswold, “Patterns of widespread decline in North American bumble bees,” Proc. Natl. Acad. Sci., vol. 108, no. 2, pp. 662–667, 2011.
[3] D. A. Sumner and H. Boriss, “Bee-conomics and the Leap in Pollination Fees: the Supply and Demand Issues and Operation of the Pollination Market,” pp. 9–11, 2006.
[4] R. Ward, A. Whyte, and R. R. James, “A Tale of Two Bees: Looking at Pollination Fees for Almonds and Sweet Cherries,” Am. Entomol., vol. 56, no. 3, pp. 170–177, 2010.
[5] R. Wood, R. Nagpal, and G.-Y. Wei, “Flight of the Robobees,” Sci. Am., vol. 308, no. 3, pp. 60–65, 2013.
[6] A. Gongal, S. Amatya, M. Karkee, Q. Zhang, and K. Lewis, “Sensors and systems for fruit detection and localization: A review,” Comput. Electron. Agric., vol. 116, pp. 8–19, 2015.
[7] A. R. Jiménez, R. Ceres, and J. L. Pons, “A Survey of Computer Vision Methods for Locating Fruit on Trees,” Trans. ASAE, vol. 43, no. 6, pp. 1911–1920, 2000.
[8] D. R. Martin, C. C. Fowlkes, and J. Malik, “Learning to detect natural image boundaries using local brightness, color, and texture cues,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 26, no. 5, pp. 530–549, 2004.
[9] H. D. Cheng, X. H. Jiang, Y. Sun, and J. Wang, “Color image segmentation: Advances and prospects,” Pattern Recognit., vol. 34, no. 12, pp. 2259–2281, 2001.
[10] M. Hočevar, B. Širok, T. Godeša, and M. Stopar, “Flowering estimation in apple orchards by image analysis,” Precis. Agric., vol. 15, no. 4, pp. 466–478, 2014.
[11] A. Payne, K. Walsh, P. Subedi, and D. Jarvis, “Estimating mango crop yield using image analysis using fruit at ‘stone hardening’ stage and night time imaging,” Comput. Electron. Agric., vol. 100, pp. 160–167, 2014.
[12] R. Linker, O. Cohen, and A. Naor, “Determination of the number of green apples in RGB images recorded in orchards,” Comput. Electron. Agric., vol. 81, pp. 45–57, 2012.
[13] C. Zhao, W. S. Lee, and D. He, “Immature green citrus detection based on colour feature and sum of absolute transformed difference (SATD) using colour images in the citrus grove,” Comput. Electron. Agric., vol. 124, pp. 243–253, 2016.
[14] K. R. Thorp and D. A. Dierig, “Color image segmentation approach to monitor flowering in lesquerella,” Ind. Crops Prod., vol. 34, no. 1, pp. 1150–1159, 2011.
[15] U. Dorj, K. Lee, and M. Lee, “A computer vision algorithm for tangerine yield estimation,” Int. J. Bio-Science Bio-Technology, vol. 5, no. 5, pp. 101–110, 2013.
[16] A. Aquino, B. Millan, S. Gutiérrez, and J. Tardáguila, “Grapevine flower estimation by applying artificial vision techniques on images with uncontrolled scene and multi-model analysis,” Comput. Electron. Agric., vol. 119, pp. 92–104, 2015.
[17] N. Bairwa and N. K. Agrawal, “Counting of Flowers using Image Processing,” Int. J. Eng. Res. Technol., vol. 3, no. 9, pp. 775–779, 2014.
[18] R. Kaur and S. Porwal, “An Optimized Computer Vision Approach to Precise Well-Bloomed Flower Yielding Prediction using Image Segmentation,” Int. J. Comput. Appl., vol. 119, no. 23, pp. 15–20, 2015.
[19] A. Plebe and G. Grasso, “Localization of spherical fruits for robotic harvesting,” Mach. Vis. Appl., vol. 13, pp. 70–79, 2001.
[20] L. Lucchese and S. K. Mitra, “Color image segmentation: A state-of-the-art survey,” Proc. Indian Natl. Sci. Acad. Part A, vol. 67, pp. 207–221, 2001.