
Towards vision-based dual arm robotic fruit harvesting

Published 14 Jun 2023 in cs.RO (arXiv:2306.08729v2)

Abstract: Interest in agricultural robotics has increased considerably in recent years due to benefits such as improved productivity and reduced labor. However, problems associated with unstructured environments make the development of robotic harvesters challenging. Most research in agricultural robotics focuses on single-arm manipulation. Here, we propose a dual-arm approach. We present a dual-arm fruit harvesting robot equipped with an RGB-D camera and cutting and collecting tools. We exploit the cooperative task description to maximize the capabilities of the dual-arm robot. We designed a Hierarchical Quadratic Programming based control strategy to fulfill the set of hard constraints related to the robot and environment: robot joint limits, robot self-collisions, and robot-fruit and robot-tree collisions. We combine deep learning and standard image processing algorithms to detect and track fruits, as well as the tree trunk, in the scene. We validate our perception methods on real-world RGB-D images and our control method in simulated experiments.


Summary

  • The paper introduces a novel dual-arm system integrating computer vision and HQP control for precise, collision-free fruit harvesting.
  • It employs deep learning (YOLOv5, DeepSORT) for fruit detection and tracking, achieving 100% accuracy in non-occluded images.
  • Simulation and laboratory experiments validate the system’s efficient dual-arm coordination and robust obstacle avoidance.

Vision-Based Dual Arm Robotic Fruit Harvesting System

This paper introduces a dual-arm robotic system for autonomous fruit harvesting, addressing challenges in unstructured agricultural environments. The system integrates computer vision for fruit detection and tracking with a hierarchical quadratic programming (HQP) based control strategy for precise and collision-free dual-arm manipulation. The authors validate their approach through real-world image data and simulated experiments.

System Architecture and Components

The robotic harvesting system comprises the following key components:

  • Dual-Arm Robot: A dual-arm robot (BAZAR) equipped with cutting and collecting tools, designed for cooperative manipulation.
  • RGB-D Camera: An RGB-D camera for real-time fruit detection, tracking, and scene understanding.
  • Perception Module: A perception system that combines deep learning (YOLOv5, DeepSORT) and standard image processing techniques for fruit localization, tracking, and tree trunk estimation.
  • Motion Control Module: An HQP-based control strategy for dual-arm coordination, collision avoidance, and precise manipulation.

Perception and Scene Understanding

The perception module performs three primary tasks:

  • Fruit Localization: Utilizes YOLOv5 for fruit detection and bounding box estimation. DeepSORT is employed for tracking fruits across successive frames. 3D spatial positions of fruits are computed by incorporating depth information from the RGB-D camera. The authors assume fruits are ellipsoids to simplify the localization process.
  • Fruit Selection: Determines the optimal fruit for harvesting based on its distance to the camera's optical center. A list of fruit IDs and their corresponding distances is maintained and updated to ensure robustness against temporary occlusions.
  • Tree Localization: Estimates the tree trunk position using a combination of HSV color space segmentation and depth masking. This method avoids complex machine learning models and is computationally efficient.
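The localization and selection steps above can be sketched as a standard pinhole back-projection followed by a nearest-fruit query. This is a minimal illustration, not the paper's exact pipeline: the intrinsics, the `(u, v, z)` track format, and the fruit IDs below are all hypothetical stand-ins for YOLOv5 bounding-box centers with DeepSORT track IDs.

```python
import numpy as np

def deproject(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth z (meters) to a 3-D camera-frame point."""
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def select_fruit(tracks, fx, fy, cx, cy):
    """Pick the tracked fruit closest to the camera's optical center.
    tracks: {track_id: (u, v, z)} -- box centers with a representative box depth."""
    dists = {tid: np.linalg.norm(deproject(u, v, z, fx, fy, cx, cy))
             for tid, (u, v, z) in tracks.items()}
    return min(dists, key=dists.get)

# Toy intrinsics and three tracked fruits (IDs 1-3, illustrative values)
fx = fy = 600.0
cx, cy = 320.0, 240.0
tracks = {1: (400, 250, 0.9), 2: (300, 200, 0.6), 3: (100, 300, 1.2)}
print(select_fruit(tracks, fx, fy, cx, cy))  # → 2 (the nearest fruit)
```

Keeping the ID-to-distance map across frames, as the paper describes, lets the selector fall back to the last known distance when a track is temporarily occluded.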
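The trunk-localization idea (a color gate intersected with a depth band) can be shown with plain NumPy masks. The hue bounds and depth band below are illustrative assumptions, not the paper's tuned values, and a real pipeline would operate on an OpenCV HSV conversion of the camera frame.

```python
import numpy as np

def trunk_mask(hsv, depth, hue_lo=5, hue_hi=25, d_lo=0.3, d_hi=1.5):
    """Intersect a hue range (brown-ish trunk colors, hypothetical bounds)
    with a depth band to isolate candidate trunk pixels."""
    color = (hsv[..., 0] >= hue_lo) & (hsv[..., 0] <= hue_hi)
    near = (depth >= d_lo) & (depth <= d_hi)
    return color & near

def trunk_centroid(mask, depth):
    """Mean pixel location (u, v) and mean depth of the masked region."""
    vs, us = np.nonzero(mask)
    return float(us.mean()), float(vs.mean()), float(depth[mask].mean())

# Synthetic 4x4 scene: two "trunk" pixels (brown hue, ~1 m), rest background
hsv = np.zeros((4, 4, 3))
hsv[..., 0] = 100.0
depth = np.full((4, 4), 3.0)
hsv[1, 1, 0] = hsv[1, 2, 0] = 10.0
depth[1, 1] = depth[1, 2] = 1.0
mask = trunk_mask(hsv, depth)
print(trunk_centroid(mask, depth))  # → (1.5, 1.0, 1.0)
```

Because this is thresholding rather than a learned model, it runs in a few milliseconds per frame, which matches the paper's stated motivation of avoiding a heavier machine-learning model for the trunk.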

Motion Control and Task Planning

The motion control module employs an HQP-based control strategy to manage the dual-arm system. The authors use a cooperative task representation that splits the harvesting task into an absolute component (cutting) and a relative component (collecting). For obstacle avoidance, the tree trunk is modeled as a cylinder and the fruits as ellipsoids, and a velocity damper constraint keeps the arms' motion smooth and collision-free as they approach these obstacles.
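The velocity damper enters the HQP as an inequality that throttles the approach speed toward an obstacle as the separation distance shrinks, in the spirit of the Faverjon-Tournassoud formulation the paper builds on. A minimal one-dimensional sketch (the safety margin, influence distance, and gain below are illustrative, not the paper's values):

```python
def damped_velocity(v_des, d, d_safe=0.05, d_infl=0.20, xi=1.0):
    """Clamp a commanded approach speed v_des (m/s, positive = toward obstacle)
    so that the distance rate obeys  d_dot >= -xi * (d - d_safe) / (d_infl - d_safe)
    whenever d < d_infl. Outside the influence zone the command passes through;
    at d = d_safe the allowed approach speed reaches zero."""
    if d >= d_infl:
        return v_des
    v_max = xi * (d - d_safe) / (d_infl - d_safe)  # max allowed approach speed
    return min(v_des, v_max)

print(damped_velocity(0.5, 0.30))  # → 0.5 (outside the influence zone)
print(damped_velocity(0.5, 0.05))  # → 0.0 (at the safety margin: full stop)
```

In the full controller this scalar bound becomes one linear inequality per robot-fruit, robot-trunk, and self-collision pair, stacked into the hard-constraint level of the hierarchical QP alongside the joint limits.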

Experimental Results

The authors conducted experiments in a laboratory setting using artificial orange trees. The fruit detection and tracking strategies were tested in real time using an RGB-D camera, while a CoppeliaSim simulation environment was used to validate the motion control strategy. The system achieved correct fruit ID assignment in 83% of occluded images and 100% of non-occluded images. The robot successfully harvested the detected fruits in simulation, demonstrating the effectiveness of the dual-arm coordination and collision avoidance mechanisms.

Conclusion

The research presents a functional dual-arm autonomous harvesting system. The authors highlight the successful integration of deep learning-based perception with HQP-based motion control for robotic fruit harvesting. The experimental results demonstrate the potential of the system for real-world applications. Future work involves refining the tracking algorithm, improving peduncle detection, and developing more robust cutting and collecting strategies.
