DEVELOPMENT OF AN AUTONOMOUS NAVIGATION AGRICULTURAL ROBOTIC PLATFORM BASED ON MACHINE VISION

Document Type: Original Article

Authors

1 Prof., Ag. Eng. Dept., Fac. of Ag., Kafrelsheikh Univ., Egypt.

2 Assist. Prof., Ag. Eng. Dept., Fac. of Ag., Kafrelsheikh Univ., Egypt.

3 Head Researcher, Ag. Eng. Res. Inst., Ag. Res. Center, Dokki, Giza, Egypt.

4 Engineer, Plant Protection Res. Inst. (PPRI), Ag. Res. Center, Dokki, Giza, Egypt.

Abstract

In Egypt, crop management still lags behind current practice, which exploits advances in mechanical design, sensing, and electronics technology. These technologies have been introduced in many places and have achieved high accuracy in different field operations. Therefore, an autonomous agricultural robotic platform (ARP) based on machine vision was developed and constructed. The ARP consists of two main parts: 1) a power transmission and auto-guidance system; and 2) a robotic platform. The experiments were carried out at the Department of Agricultural Engineering, Faculty of Agriculture, Kafrelsheikh University, during 2014-2015. In this study, laboratory experiments were conducted to optimize the accuracy of ARP control using machine vision in terms of autonomous navigation and the performance of the robot's guidance system. To evaluate the image processing technique, four camera resolutions (1080×1920, 1944×2592, 2736×3648, and 3240×4320 pixels) and three camera heights (500, 700, and 1000 mm) were used to measure the execution time of the image processing steps. The flight time of spray droplets was calculated under three levels of spray height (70, 100, and 130 mm), three levels of spray pressure (1, 3, and 5 bar), and three levels of nozzle size (1.5, 3, and 5 mm). The effect of changing the duty cycle percentage (DC, %) was also studied to control the speed of the ARP with pulse width modulation (PWM) signals. Different nozzle tip sizes and spray pressures were used to trace the variation in flow rate. Based on the total execution time and droplet flight time, the speed of the ARP was determined according to the camera resolution and camera height.
Results showed that the robotic platform's machine-vision guidance system was able to adequately distinguish the path, resist image noise, and give a smaller lateral offset error than human operators. The average lateral error of the autonomous system was 2.75, 19.33, 21.22, 34.18, and 16.69 mm, while that of the human operator was 32.70, 4.85, 7.85, 38.35, and 14.75 mm for the straight path, curved path, sine-wave path, offset discontinuity, and angle discontinuity, respectively. The shortest image processing execution time was obtained at the minimum camera resolution with a camera height of 500 mm, while increasing the nozzle size at the same height and spray pressure decreased the flight time. The most favorable robotic platform speeds were obtained at lower camera resolutions and wider distances between the nozzle and the camera.
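The two quantities the abstract ties together (droplet flight time as a function of spray height and pressure, and platform speed as a function of PWM duty cycle) can be sketched as follows. This is an illustrative sketch, not the authors' model: the exit-velocity formula is a simple Bernoulli approximation (v = sqrt(2P/ρ)) that ignores drag and nozzle losses, and the linear duty-cycle-to-speed mapping, the water density, and the maximum speed value are all assumptions made for the example.

```python
RHO_WATER = 1000.0  # kg/m^3, assumed density of the spray liquid


def droplet_flight_time(spray_height_mm: float, pressure_bar: float) -> float:
    """Rough droplet flight-time estimate in seconds.

    Exit velocity from the Bernoulli approximation v = sqrt(2*P/rho),
    then time = height / velocity. Drag and nozzle losses are ignored.
    """
    pressure_pa = pressure_bar * 1e5               # 1 bar = 100 kPa
    velocity = (2.0 * pressure_pa / RHO_WATER) ** 0.5  # m/s at the nozzle exit
    return (spray_height_mm / 1000.0) / velocity   # mm -> m, then t = h / v


def speed_from_duty_cycle(duty_cycle_pct: float, max_speed_m_s: float = 0.5) -> float:
    """Assumed linear mapping from PWM duty cycle (%) to platform speed (m/s)."""
    return max_speed_m_s * duty_cycle_pct / 100.0


# Example at one of the abstract's operating points: 100 mm height, 3 bar.
t_flight = droplet_flight_time(100, 3)
v_platform = speed_from_duty_cycle(50)
```

Under these assumptions, higher spray pressure raises the exit velocity and so shortens the flight time, matching the abstract's direction of effect; the duty-cycle mapping shows only the control idea, since the real speed curve would be measured on the platform.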

Keywords

