LeTac-MPC: Learning Model Predictive Control for Tactile-reactive Grasping
Abstract: Grasping is a crucial task in robotics, requiring tactile feedback and reactive grasp adjustments for robust handling of objects under various conditions and with differing physical properties. In this paper, we introduce LeTac-MPC, a learning-based model predictive control (MPC) approach for tactile-reactive grasping. Our method enables the gripper to grasp objects with different physical properties in dynamic and force-interactive tasks. We use a vision-based tactile sensor, GelSight, which perceives high-resolution tactile feedback containing information on the physical properties and state of the grasped object. LeTac-MPC incorporates a differentiable MPC layer that models the embeddings extracted by a neural network (NN) from tactile feedback, enabling convergent and robust grasping control at 25 Hz. We propose a fully automated data collection pipeline and collect a dataset using only standardized blocks with different physical properties; nevertheless, the trained controller generalizes to everyday objects of different sizes, shapes, materials, and textures. Experimental results demonstrate the effectiveness and robustness of the proposed approach. We compare LeTac-MPC with two purely model-based tactile-reactive controllers (MPC and PD) and with open-loop grasping, and show that LeTac-MPC achieves the best performance and generalizability in dynamic and force-interactive tasks. We release our code and dataset at https://github.com/ZhengtongXu/LeTac-MPC.
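To give a rough feel for the receding-horizon idea behind such a tactile MPC layer, the sketch below solves a deliberately simplified quadratic tracking problem each control step. This is not the paper's differentiable MPC (which operates on learned tactile embeddings and backpropagates through the optimization); here the state is a hypothetical 1D gripper width with toy dynamics and hand-picked weights, solved by plain gradient descent in pure Python.

```python
# Minimal receding-horizon sketch, assuming a hypothetical 1D model:
# state x = gripper width, control u = width rate, x_{k+1} = x_k + dt*u_k.
# Cost J = sum_k q*(x_k - x_ref)^2 + r*u_k^2, minimized by gradient descent.
# Illustrative only; not the paper's learned differentiable MPC layer.

def mpc_step(x0, x_ref, horizon=10, dt=0.04, q=1.0, r=0.1,
             iters=500, lr=0.1):
    """Solve a small unconstrained quadratic MPC and return the first
    control input (receding-horizon: only u[0] is applied)."""
    u = [0.0] * horizon
    for _ in range(iters):
        # Roll out the toy dynamics over the horizon.
        xs = [x0]
        for k in range(horizon):
            xs.append(xs[-1] + dt * u[k])
        # dJ/du_k: the effort term plus every future tracking error
        # that u_k influences (dx_j/du_k = dt for j > k).
        grad = [0.0] * horizon
        for k in range(horizon):
            g = 2.0 * r * u[k]
            for j in range(k + 1, horizon + 1):
                g += 2.0 * q * (xs[j] - x_ref) * dt
            grad[k] = g
        u = [ui - lr * gi for ui, gi in zip(u, grad)]
    return u[0]

# Closed loop at dt = 0.04 s (25 Hz): re-solve every step, apply u[0].
x, x_ref = 0.05, 0.03
for _ in range(50):
    x += 0.04 * mpc_step(x, x_ref)
```

In the paper's setting, the quadratic objective instead penalizes deviation of NN-extracted tactile embeddings from a target, and the optimization is embedded as a differentiable layer so the whole pipeline trains end-to-end; the receding-horizon structure, however, is the same.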