Automatic detection of phenological stages in Rosa spp. using YOLOv8 convolutional neural networks
DOI: https://doi.org/10.15446/agron.colomb.v43n3.122164
Keywords: computer vision, deep learning, artificial intelligence in agriculture, plant phenology, greenhouse crops
This study evaluated the performance of YOLOv8 convolutional neural network models for the automatic detection of phenological stages in greenhouse-grown cut roses (Rosa spp.). Image acquisition was conducted in a commercial greenhouse in Tocancipá, Colombia, using a ground-based mobile platform equipped with RGB cameras, thereby avoiding the operational limitations of unmanned aerial vehicles (UAVs) in enclosed environments. Images were collected during five sampling periods with a Nikon camera mounted on the mobile platform, covering five hydroponic benches, each divided into five 6.4-m plots, for a total of 25 plots. In total, 2,000 images and 4,653 annotated objects were obtained across nine classes (eight phenological stages and one multipurpose class). Model performance was evaluated using precision, recall, F1-score, mAP50, and mAP50–95. Individual models outperformed the multipurpose model, with the C_stage model achieving an F1-score of 0.87 in validation and 0.84 in testing. The multipurpose model required extending training to 200 epochs to converge, after which its performance improved (F1-score = 0.75 and precision = 0.78 in validation; F1-score and precision both 0.72 in testing), indicating its potential for simultaneous multi-stage detection under greenhouse conditions. Correlation analysis showed that object size was the main factor influencing model performance (r ≥ 0.90), whereas the number of labeled samples per class had only a weak relationship with the metrics. This explained the higher accuracy in phenological stages with larger and more distinctive floral structures (C_stage, S_color) and the lower performance in early stages (rice, chickpea), whose buds occupied less than 0.3% of the image area.
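The workflow summarized above (fine-tuning YOLOv8, evaluating precision/recall/F1/mAP, and correlating per-class object size with performance) maps onto the Ultralytics Python API. The snippet below is a minimal sketch of that pipeline; the dataset file name (roses.yaml), the nano weights, the hyperparameters, and the per-class placeholder arrays are illustrative assumptions, not the authors' exact configuration or data.

```python
# Minimal sketch of the workflow described in the abstract, using the
# Ultralytics YOLOv8 API. "roses.yaml" and all numeric values below are
# illustrative assumptions, not the study's actual setup or results.
import numpy as np
from ultralytics import YOLO

# Fine-tune a COCO-pretrained model on the annotated rose images; the
# multipurpose model in the study needed 200 epochs to converge.
model = YOLO("yolov8n.pt")
model.train(data="roses.yaml", epochs=200, imgsz=640)

# Validate and derive the metrics reported in the abstract.
metrics = model.val()
p, r = metrics.box.mp, metrics.box.mr  # mean precision / mean recall
f1 = 2 * p * r / (p + r)               # F1-score
print(f"P={p:.2f}  R={r:.2f}  F1={f1:.2f}  "
      f"mAP50={metrics.box.map50:.2f}  mAP50-95={metrics.box.map:.2f}")

# Pearson correlation between per-class object size and per-class F1,
# analogous to the reported size-performance analysis (placeholder values).
obj_size_pct = np.array([0.2, 0.3, 0.9, 1.8, 2.5])   # % of image area
class_f1 = np.array([0.55, 0.58, 0.74, 0.83, 0.87])  # per-class F1
print("r =", np.corrcoef(obj_size_pct, class_f1)[0, 1])
```

A correlation coefficient near or above 0.90 in this kind of analysis would mirror the study's finding that object size, rather than label count, drives detection performance.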
License
Copyright (c) 2025 Agronomía Colombiana

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
© Centro Editorial de la Facultad de Ciencias Agrarias, Universidad Nacional de Colombia
Reproduction and quotation of material appearing in the journal are authorized provided the following are explicitly indicated: journal name, author(s), year, volume, issue, and pages of the source. The ideas and observations recorded by the authors are their own and do not necessarily represent the views and policies of the Universidad Nacional de Colombia. Mention of products or commercial firms in the journal does not constitute a recommendation or endorsement by the Universidad Nacional de Colombia; furthermore, the use of such products should comply with the product label recommendations.
Agronomía Colombiana, published by the Centro Editorial of the Facultad de Ciencias Agrarias, Universidad Nacional de Colombia, is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License, based on the work at http://revistas.unal.edu.co/index.php/agrocol/.