TR2022-058

Synthesizing and Simulating Volumetric Meshes from Vision-based Tactile Imprints


    •  Zhu, X., Jain, S., Tomizuka, M., van Baar, J., "Synthesizing and Simulating Volumetric Meshes from Vision-based Tactile Imprints", ICRA 2022 Workshop on Reinforcement Learning for Contact-Rich Manipulation, May 2022.
      @inproceedings{Zhu2022may3,
        author = {Zhu, Xinghao and Jain, Siddarth and Tomizuka, Masayoshi and van Baar, Jeroen},
        title = {Synthesizing and Simulating Volumetric Meshes from Vision-based Tactile Imprints},
        booktitle = {ICRA 2022 Workshop on Reinforcement Learning for Contact-Rich Manipulation},
        year = 2022,
        month = may,
        url = {https://www.merl.com/publications/TR2022-058}
      }
Research Areas: Machine Learning, Robotics

Abstract:

Vision-based tactile sensors typically employ a deformable elastomer and a camera to provide high-resolution contact images. This work focuses on learning to simulate and synthesize the volumetric mesh of the elastomer from the image imprints acquired by tactile sensors. Accurate volumetric meshes of the elastomer provide direct contact information and can benefit robotic grasping and manipulation. Our method [1] uses a train-then-adapt scheme to leverage both synthetic image-mesh pairs generated with finite element methods (FEM) and real-world images from physical sensors. The approach accurately reconstructs the deformation of the real-world tactile sensor elastomer across various domains. While the proposed learning approaches have been shown to produce viable solutions, we discuss remaining limitations and challenges for real-world applications.
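The train-then-adapt idea described above can be illustrated with a minimal sketch: pretrain a regressor on plentiful synthetic FEM image-mesh pairs, then fine-tune it on a small set of real sensor data. All names, the linear model, and the data shapes here are illustrative assumptions, not the paper's actual network or datasets:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear(images, meshes, w_init=None, lr=0.1, steps=200):
    """Gradient-descent fit of a linear map from image features to mesh
    vertex displacements (a stand-in for the paper's learned model)."""
    n_feat, n_out = images.shape[1], meshes.shape[1]
    w = np.zeros((n_feat, n_out)) if w_init is None else w_init.copy()
    for _ in range(steps):
        grad = images.T @ (images @ w - meshes) / len(images)
        w -= lr * grad
    return w

# Stage 1 (train): plentiful synthetic FEM image-mesh pairs.
true_map = rng.normal(size=(8, 12))
syn_x = rng.normal(size=(500, 8))
syn_y = syn_x @ true_map

# Stage 2 (adapt): few real sensor images, with a simulated sim-to-real gap.
real_map = true_map + 0.2 * rng.normal(size=true_map.shape)
real_x = rng.normal(size=(20, 8))
real_y = real_x @ real_map

w_pre = fit_linear(syn_x, syn_y)                              # train on synthetic data
w_adapt = fit_linear(real_x, real_y, w_init=w_pre, steps=50)  # adapt on real data

err_pre = np.mean((real_x @ w_pre - real_y) ** 2)
err_adapt = np.mean((real_x @ w_adapt - real_y) ** 2)
print(f"real-data MSE before adaptation: {err_pre:.4f}")
print(f"real-data MSE after adaptation:  {err_adapt:.4f}")
```

Initializing the adaptation stage from the pretrained weights, rather than from scratch, is what lets the few real-world samples close the sim-to-real gap without overfitting.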

 
