
LeTac-MPC: Learning Model Predictive Control for Tactile-reactive Grasping

Published 7 Mar 2024 in cs.RO and cs.AI | arXiv:2403.04934v2

Abstract: Grasping is a crucial task in robotics, necessitating tactile feedback and reactive grasping adjustments for robust grasping of objects under various conditions and with differing physical properties. In this paper, we introduce LeTac-MPC, a learning-based model predictive control (MPC) for tactile-reactive grasping. Our approach enables the gripper to grasp objects with different physical properties on dynamic and force-interactive tasks. We utilize a vision-based tactile sensor, GelSight, which is capable of perceiving high-resolution tactile feedback that contains information on the physical properties and states of the grasped object. LeTac-MPC incorporates a differentiable MPC layer designed to model the embeddings extracted by a neural network (NN) from tactile feedback. This design facilitates convergent and robust grasping control at a frequency of 25 Hz. We propose a fully automated data collection pipeline and collect a dataset only using standardized blocks with different physical properties. However, our trained controller can generalize to daily objects with different sizes, shapes, materials, and textures. The experimental results demonstrate the effectiveness and robustness of the proposed approach. We compare LeTac-MPC with two purely model-based tactile-reactive controllers (MPC and PD) and open-loop grasping. Our results show that LeTac-MPC has optimal performance in dynamic and force-interactive tasks and optimal generalizability. We release our code and dataset at https://github.com/ZhengtongXu/LeTac-MPC.


Summary

  • The paper presents a framework that couples a differentiable MPC layer with neural-network embeddings of tactile feedback to enable adaptive, low-force grasping.
  • It employs an automated dual-arm data collection system using high-resolution GelSight sensors to train and generalize tactile-reactive control.
  • Extensive experiments show that LeTac-MPC outperforms conventional control methods in dynamic, force-interactive tasks with diverse objects.

Learning Model Predictive Control for Tactile-reactive Grasping: A Review of LeTac-MPC

The paper presents LeTac-MPC, a learning-based model predictive control (MPC) paradigm for tactile-reactive grasping and a notable contribution to the field of robotic manipulation. The approach combines a neural network (NN) with a differentiable MPC layer so that high-resolution tactile feedback can be encoded and used directly for grasp control.

The proposed framework aims to recognize and adapt to the physical properties of objects during manipulation, a capability essential for reliable robotic grasping. The study leverages high-resolution tactile data from the GelSight sensor, enabling robust grasping of objects with diverse physical characteristics, shapes, sizes, and surface textures. Below is an assessment of the core contributions and implications of this research.

Contributions and Methodology

  1. Differentiable MPC Layer: At the heart of the paper is the differentiable MPC layer, which distinguishes this approach from traditional control methods. The layer works in tandem with a convolutional neural network (CNN): the embedding extracted by the NN is fed into the MPC layer, which computes control actions that keep the grasp robust. This design combines the strengths of learning-based and model-based methods, which is particularly valuable for grasping tasks with stringent real-time demands (a minimal sketch of this pairing follows the list).
  2. Automated Data Collection: The paper introduces a fully automated pipeline for data collection, utilizing a dual-arm robotic setup. This innovation simplifies the process of acquiring relevant tactile feedback necessary for training, thereby enhancing the model's ability to generalize effectively to everyday objects despite being trained on data from standardized blocks.
  3. Generalization and Robustness: The authors demonstrate that LeTac-MPC exhibits significant generalization capabilities, allowing it to manage daily objects with different physical properties, shapes, and surface textures. In experimental trials, LeTac-MPC outperformed purely model-based methods and open-loop systems in dynamic manipulation scenarios and force-interactive tasks.
  4. Comparative Analysis: Extensive experiments contrast LeTac-MPC with PD control, model-based MPC, and open-loop grasping. LeTac-MPC showed superior performance, handling dynamic shaking and obstacle-collision scenarios better than the baselines. Importantly, the study highlights its ability to achieve stable grasps with less force, reducing the risk of damage to delicate items.
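To make the pairing of a tactile encoder with a differentiable MPC layer concrete, the sketch below shows one way such a pipeline can be wired up in PyTorch with cvxpylayers (a differentiable convex-optimization layer). It is an illustrative reconstruction rather than the authors' released code: the network architecture, horizon length, cost weights, and the scalar gripper-position target are all assumptions.

```python
# Illustrative sketch only (assumed architecture and parameters, not the paper's code).
import cvxpy as cp
import torch
import torch.nn as nn
from cvxpylayers.torch import CvxpyLayer


class TactileEncoder(nn.Module):
    """Small CNN mapping a tactile image to a scalar target gripper position."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, tactile_img):
        return self.backbone(tactile_img)  # (batch, 1)


def build_mpc_layer(horizon=10, u_max=0.01, lam=0.1):
    """Differentiable MPC posed as a parametric QP: track the NN target with
    smooth, bounded gripper increments; solving it inside a CvxpyLayer lets
    gradients flow back into the encoder."""
    p = cp.Variable(horizon + 1)   # gripper positions over the horizon
    u = cp.Variable(horizon)       # per-step position increments (controls)
    p0 = cp.Parameter()            # current gripper position
    target = cp.Parameter()        # NN-predicted target position
    cost = cp.sum_squares(p[1:] - target) + lam * cp.sum_squares(u)
    constraints = [p[0] == p0, p[1:] == p[:-1] + u, cp.abs(u) <= u_max]
    problem = cp.Problem(cp.Minimize(cost), constraints)
    return CvxpyLayer(problem, parameters=[p0, target], variables=[u])


encoder, mpc_layer = TactileEncoder(), build_mpc_layer()
tactile_img = torch.randn(4, 3, 64, 64)        # batch of dummy tactile images
p0 = torch.zeros(4)                            # current gripper positions
target = encoder(tactile_img).squeeze(-1)      # (batch,) targets from the CNN
(u_plan,) = mpc_layer(p0, target)              # (batch, horizon) control plans
loss = u_plan.pow(2).mean()                    # placeholder training loss
loss.backward()                                # gradients reach the CNN through the QP
```

Because the QP is solved inside the computation graph, the tracking and smoothness structure of MPC is preserved while the encoder is trained end to end, which is the core idea behind combining learned embeddings with a differentiable MPC layer.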

Implications and Future Directions

The LeTac-MPC framework signifies an advance in addressing challenges associated with real-time tactile-reactive grasping by unifying complex tactile signal processing within a coherent control strategy. The implications are manifold:

  1. Enhanced Grasp Stability: By employing tactile feedback for real-time control, robots can achieve a higher degree of grasp stability even when subjected to rapid state changes or external disturbances.
  2. Real-time Integration of High-dimensional Feedback: LeTac-MPC integrates high-resolution tactile feedback directly into the control loop, meeting the real-time demands of dynamic environments and offering a scalable pattern for broader manipulation tasks (a simple rate-limited control-loop sketch follows this list).
  3. Potential Application Areas: Given its adaptability, LeTac-MPC can facilitate advancements across domains requiring delicate manipulation or variable object interaction, including agriculture, logistics, and even teleoperation in hazardous environments.
  4. Future Developments: Prospective research could extend this approach to unify visual and tactile data, leveraging the strengths of both sensory modalities. Additionally, more sophisticated nonlinear optimization within the differentiable layer could enhance its capacity to manage more complex manipulation tasks.
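On the real-time point, the sketch below illustrates the receding-horizon pattern such a controller typically follows at a fixed rate (25 Hz in the paper): encode the latest tactile image, solve the MPC, apply only the first increment, and repeat. The hardware-interface functions read_tactile_image and send_gripper_position are hypothetical stubs, not a real gripper or GelSight driver API, and the loop reuses the assumed encoder and mpc_layer from the earlier sketch.

```python
# Hypothetical receding-horizon loop; the interface functions are placeholder stubs.
import time
import torch


def control_loop(encoder, mpc_layer, read_tactile_image, send_gripper_position,
                 gripper_position=0.0, control_hz=25.0):
    period = 1.0 / control_hz
    while True:
        t_start = time.monotonic()

        # 1. Perceive: encode the latest tactile image into a target position.
        img = read_tactile_image()               # expected shape (1, 3, H, W)
        with torch.no_grad():
            target = encoder(img).squeeze()      # scalar target tensor

        # 2. Plan: solve the MPC for a horizon of gripper increments.
        p0 = torch.tensor(gripper_position, dtype=torch.float32)
        (u_plan,) = mpc_layer(p0, target)

        # 3. Act: apply only the first increment, then replan next cycle.
        gripper_position += float(u_plan[0])
        send_gripper_position(gripper_position)

        # 4. Sleep out the remainder of the control period (e.g., 25 Hz).
        time.sleep(max(0.0, period - (time.monotonic() - t_start)))
```

Applying only the first planned increment before replanning is what makes the controller reactive: each new tactile frame can shift the target, and the bounded increments keep the commanded motion smooth.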

In summary, this paper introduces an innovative approach to tactile-reactive control that promises to bring robotic manipulation closer to human-like adaptability and learning efficiency. While the presented outcomes are promising, the exploration of seamless integration with other sensory inputs and expanded datasets may yield further improvements in both robustness and generalization for diverse tasks.
