Mobile Information Systems, Volume 2019, 2019-07-10
Application of Deep Learning in Integrated Pest Management: A Real-Time System for Detection and Diagnosis of Oilseed Rape Pests
Research Article
Yong He 1,2,3, Hong Zeng 1,2,3, Yangyang Fan 1,2,3, Shuaisheng Ji 1,2,3, Jianjian Wu 1,2,3
1 College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
2 State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310058, China
3 Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China
DOI: 10.1155/2019/4570808
Received 2019-02-22; accepted 2019-06-23; published 2019-06-23
Abstract

In this paper, we propose an approach to detecting oilseed rape pests based on deep learning that improves the mean average precision (mAP) to 77.14%, an improvement of 9.7% over the original model. We deployed the model on a mobile platform so that every farmer can use the program, which diagnoses pests in real time and provides suggestions for pest control. We built an oilseed rape pest image database covering 12 typical oilseed rape pests and compared the performance of five models; SSD w/Inception was chosen as the optimal model. To further raise the mAP, we used data augmentation (DA) and added a dropout layer. Experiments performed on the Android application we developed show that our approach clearly surpasses the original model and is helpful for integrated pest management. Compared with past works, the application improves environmental adaptability, response speed, and accuracy, and it has the advantages of low cost and simple operation, making it suitable for the pest-monitoring missions of drones and the Internet of Things (IoT).
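To make the two regularization steps mentioned in the abstract concrete, below is a minimal Python sketch of photometric/geometric data augmentation and a dropout layer, written with TensorFlow/Keras. The jitter ranges, the 0.5 dropout rate, and the small classification head are illustrative assumptions, not the authors' settings; the paper's actual SSD w/Inception training pipeline is not reproduced here.

```python
import tensorflow as tf

def augment(image):
    """Photometric and geometric jitter of the kind used to enlarge a pest
    image dataset. Assumes a float image scaled to [0, 1]; all values here
    are illustrative, not taken from the paper."""
    # Note: for detection training, a horizontal flip must also flip the
    # ground-truth boxes; that bookkeeping is omitted in this sketch.
    image = tf.image.random_flip_left_right(image)
    image = tf.image.random_brightness(image, max_delta=0.2)
    image = tf.image.random_contrast(image, lower=0.8, upper=1.2)
    return tf.clip_by_value(image, 0.0, 1.0)

# A dropout layer randomly zeroes activations during training to reduce
# overfitting; 0.5 is a common default, not a value reported in the paper.
classifier_head = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(12, activation="softmax"),  # 12 pest classes
])
```

In practice such an `augment` function is applied on the fly inside the input pipeline (for example via `tf.data`'s `map`), so the effective training set grows without storing extra images.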

License

Copyright © 2019 Yong He et al.
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Corresponding Author

Yong He. College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China; State Key Laboratory of Modern Optical Instrumentation, Zhejiang University, Hangzhou 310058, China; Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs, Hangzhou 310058, China. Email: yhe@zju.edu.cn

Recommended Citation

Yong He, Hong Zeng, Yangyang Fan, Shuaisheng Ji, Jianjian Wu. Application of Deep Learning in Integrated Pest Management: A Real-Time System for Detection and Diagnosis of Oilseed Rape Pests. Mobile Information Systems, Vol. 2019 (2019).
