TR2022-055
Learning to Synthesize Volumetric Meshes from Vision-based Tactile Imprints
- "Learning to Synthesize Volumetric Meshes from Vision-based Tactile Imprints", IEEE International Conference on Robotics and Automation (ICRA), DOI: 10.1109/ICRA46639.2022.9812092, May 2022, pp. 4833-4839.
@inproceedings{Zhu2022may2,
  author    = {Zhu, Xinghao and Jain, Siddarth and Tomizuka, Masayoshi and van Baar, Jeroen},
  title     = {Learning to Synthesize Volumetric Meshes from Vision-based Tactile Imprints},
  booktitle = {2022 IEEE International Conference on Robotics and Automation (ICRA)},
  year      = 2022,
  pages     = {4833--4839},
  month     = may,
  publisher = {IEEE},
  doi       = {10.1109/ICRA46639.2022.9812092},
  isbn      = {978-1-7281-9681-7},
  url       = {https://www.merl.com/publications/TR2022-055}
}
Abstract:
Vision-based tactile sensors typically utilize a deformable elastomer and a camera mounted above to provide high-resolution image observations of contacts. Obtaining accurate volumetric meshes for the deformed elastomer can provide direct contact information and benefit robotic grasping and manipulation. This paper focuses on learning to synthesize the volumetric mesh of the elastomer based on the image imprints acquired from vision-based tactile sensors. Synthetic image-mesh pairs and real-world images are gathered from 3D finite element methods (FEM) and physical sensors, respectively. A graph neural network (GNN) is introduced to learn the image-to-mesh mappings with supervised learning. A self-supervised adaptation method and image augmentation techniques are proposed to transfer networks from simulation to reality, from primitive contacts to unseen contacts, and from one sensor to another. Using these learned and adapted networks, our proposed method can accurately reconstruct the deformation of the real-world tactile sensor elastomer in various domains, as indicated by the quantitative and qualitative results.
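The image-to-mesh mapping described in the abstract can be caricatured as message passing over the mesh's connectivity graph, where per-node features derived from the tactile image are aggregated with neighbor features and regressed to 3D node displacements. The sketch below is a hypothetical single-layer illustration in NumPy; the graph, feature dimensions, and random weights are all assumptions for the sketch and do not reproduce the authors' GNN architecture.

```python
import numpy as np

# Hypothetical illustration (not the paper's architecture): one round of
# graph message passing that maps per-node features sampled from a tactile
# image to 3D displacements of the elastomer's volumetric mesh nodes.

rng = np.random.default_rng(0)

num_nodes = 8   # toy count of mesh vertices in the elastomer volume
feat_dim = 16   # per-node features, e.g. sampled image intensities

# Toy connectivity: a ring graph as a stand-in for FEM mesh adjacency.
adj = np.zeros((num_nodes, num_nodes))
for i in range(num_nodes):
    adj[i, (i + 1) % num_nodes] = 1.0
    adj[(i + 1) % num_nodes, i] = 1.0
norm_adj = adj / adj.sum(axis=1, keepdims=True)  # row-normalized averaging

x = rng.normal(size=(num_nodes, feat_dim))           # image-derived features
w_self = 0.1 * rng.normal(size=(feat_dim, feat_dim)) # self-transform weights
w_neigh = 0.1 * rng.normal(size=(feat_dim, feat_dim))# neighbor-transform weights
w_out = 0.1 * rng.normal(size=(feat_dim, 3))         # regress (dx, dy, dz)

# Combine each node's own features with the mean of its neighbors',
# apply a nonlinearity, then regress a 3D displacement per mesh node.
h = np.tanh(x @ w_self + norm_adj @ x @ w_neigh)
displacements = h @ w_out

print(displacements.shape)  # (8, 3): one 3D displacement per mesh node
```

In the paper this mapping is trained supervised on synthetic FEM image-mesh pairs and then adapted to real sensors; the sketch only shows the forward structure of such a layer, not the training or adaptation procedures.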
Related News & Events
NEWS MERL researchers presented 5 papers and an invited workshop talk at ICRA 2022 Date: May 23, 2022 - May 27, 2022
Where: International Conference on Robotics and Automation (ICRA)
MERL Contacts: Ankush Chakrabarty; Stefano Di Cairano; Siddarth Jain; Devesh K. Jha; Pedro Miraldo; Daniel N. Nikovski; Arvind Raghunathan; Diego Romeres; Abraham P. Vinod; Yebin Wang
Research Areas: Artificial Intelligence, Machine Learning, Robotics
Brief: MERL researchers presented 5 papers at the IEEE International Conference on Robotics and Automation (ICRA) that was held in Philadelphia from May 23-27, 2022. The papers covered a broad range of topics from manipulation, tactile sensing, planning and multi-agent control. The invited talk was presented in the "Workshop on Collaborative Robots and Work of the Future", which covered some of the work done by MERL researchers on collaborative robotic assembly. The workshop was co-organized by MERL, Mitsubishi Electric Automation's North America Development Center (NADC), and MIT.
Related Publication
@article{Zhu2022mar,
  author  = {Zhu, Xinghao and Jain, Siddarth and Tomizuka, Masayoshi and van Baar, Jeroen},
  title   = {Learning to Synthesize Volumetric Meshes from Vision-based Tactile Imprints},
  journal = {arXiv},
  year    = 2022,
  month   = mar,
  doi     = {10.1109/ICRA46639.2022.9812092},
  isbn    = {978-1-7281-9681-7},
  url     = {https://ieeexplore.ieee.org/document/9812092}
}