8. Multi-species weed detection with a RetinaNet one-step network in a maize field

In: Precision agriculture '21
Authors:
J.M. López Correa (1,2), M. Todeschini (1), D.S. Pérez (2), J. Karouta (1), F. Bromberg (1), A. Ribeiro (1), D. Andújar (1)

(1) Centre for Automation and Robotics, Spanish National Research Council, CSIC-UPM, Ctra. M300 Campo Real, Km 0.200, Arganda del Rey, 28500 Madrid, Spain.
(2) DHARMa, Artificial Intelligence Laboratory, UTN, Coronel Rodríguez 273, Ciudad – Capital, M5500 Mendoza, Argentina.

Weed density and composition are not uniform across a field; nevertheless, the conventional approach is to apply treatments uniformly. Object detection networks have already reached agricultural applications and can be used for weed management. The current study developed a system that detects and classifies weeds in a single step using the RetinaNet object detection network. The procedure was based on identifying Solanum nigrum L., Cyperus rotundus L. and Echinochloa crus-galli L., including two growth stages for both a broadleaf species (S. nigrum) and a narrow-leaved species (C. rotundus), in a maize field. Predictions were evaluated with the mean average precision (mAP) metric; the overall result was 0.88, with per-class values ranging from 0.75 to 0.98.
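The mAP figures above are computed by averaging a per-class average precision (AP) over all weed classes. As a rough illustration of what that metric measures, the sketch below implements a generic PASCAL-VOC-style AP (IoU threshold 0.5, all-point interpolation over the precision-recall curve); it is not the authors' evaluation code, and the box coordinates and image identifiers are illustrative.

```python
def iou(a, b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0


def average_precision(detections, ground_truths, iou_thr=0.5):
    """AP for one class.

    detections: list of (image_id, confidence, box), one class only.
    ground_truths: dict mapping image_id -> list of boxes for that class.
    """
    # Rank detections by confidence, highest first.
    detections = sorted(detections, key=lambda d: -d[1])
    matched = {img: [False] * len(boxes) for img, boxes in ground_truths.items()}
    n_gt = sum(len(boxes) for boxes in ground_truths.values())
    tp = fp = 0
    precisions, recalls = [], []
    for img, _score, box in detections:
        gts = ground_truths.get(img, [])
        best, best_i = 0.0, -1
        for i, g in enumerate(gts):
            o = iou(box, g)
            if o > best:
                best, best_i = o, i
        # A detection is a true positive if it overlaps an unmatched
        # ground-truth box by at least the IoU threshold.
        if best >= iou_thr and not matched[img][best_i]:
            matched[img][best_i] = True
            tp += 1
        else:
            fp += 1
        precisions.append(tp / (tp + fp))
        recalls.append(tp / n_gt)
    # Area under the monotone precision envelope of the PR curve.
    ap, prev_r = 0.0, 0.0
    for i, r in enumerate(recalls):
        ap += (r - prev_r) * max(precisions[i:])
        prev_r = r
    return ap
```

The paper's mAP of 0.88 would then be the unweighted mean of this AP over the five classes (three species, with two growth stages for two of them).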

