TR2023-077

Tactile-Filter: Interactive Tactile Perception for Part Mating


    •  Ota, K., Jha, D.K., Tung, H.-Y., Tenenbaum, J.B., "Tactile-Filter: Interactive Tactile Perception for Part Mating", Robotics: Science and Systems (RSS), DOI: 10.15607/RSS.2023.XIX.079, June 2023.
      @inproceedings{Ota2023jun,
        author = {Ota, Kei and Jha, Devesh K. and Tung, Hsiao-Yu and Tenenbaum, Joshua B.},
        title = {Tactile-Filter: Interactive Tactile Perception for Part Mating},
        booktitle = {Robotics: Science and Systems (RSS)},
        year = 2023,
        month = jun,
        doi = {10.15607/RSS.2023.XIX.079},
        url = {https://www.merl.com/publications/TR2023-077}
      }
  • Research Area: Robotics

Abstract:

Humans rely on touch and tactile sensing for many dexterous manipulation tasks. Tactile sensing provides rich information about contact formations as well as the geometry of objects during interaction. Motivated by this, vision-based tactile sensors are widely used for robotic perception and control tasks. In this paper, we present a method for interactive perception using vision-based tactile sensors for a part-mating task, where a robot uses tactile sensing with a particle-filter feedback mechanism to incrementally improve its estimate of which objects (pegs and holes) fit together. To do this, we first train a deep neural network that uses tactile images to predict the probabilistic correspondence between arbitrarily shaped objects that fit together. The trained model is used to design a particle filter that serves two purposes. First, given one partial (or non-unique) observation of the hole, it incrementally improves the estimate of the correct peg by sampling more tactile observations. Second, it selects the robot's next touch (and thus the next tactile image) so as to maximize uncertainty reduction, minimizing the number of interactions during the perception task. We evaluate our method on several part-mating tasks with novel objects using a robot equipped with a vision-based tactile sensor, and show the efficiency of the proposed action-selection method against a naive baseline.
See the supplementary video at https://www.youtube.com/watch?v=jMVBg_e3gLw.
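The two roles of the filter described in the abstract — a Bayesian belief update over candidate pegs driven by the correspondence network, and selection of the next touch that maximally reduces expected uncertainty — can be illustrated with a minimal sketch. This is not the paper's implementation: the observation model here is a hypothetical hand-written likelihood table standing in for the trained network, the belief is a discrete distribution over a small set of candidate pegs, and `obs_model`, `select_action`, and the toy probabilities are all illustrative assumptions.

```python
import math

N_PEGS = 4  # illustrative number of candidate pegs

def entropy(belief):
    """Shannon entropy (in nats) of a discrete belief over pegs."""
    return -sum(p * math.log(p) for p in belief if p > 0.0)

def bayes_update(belief, likelihoods):
    """One filter step: reweight the belief by p(observation | peg)."""
    posterior = [b * l for b, l in zip(belief, likelihoods)]
    z = sum(posterior)
    return [p / z for p in posterior]

def obs_model(action):
    """Hypothetical stand-in for the trained correspondence network.

    Returns a table of p(observation o | peg k) for a given contact action;
    rows are possible observations, columns are candidate pegs. In the paper
    this likelihood comes from a deep network over tactile images.
    """
    if action == 0:  # a touch that discriminates peg 0 from the rest
        return [[0.9, 0.1, 0.1, 0.1],
                [0.1, 0.9, 0.9, 0.9]]
    else:            # an uninformative touch
        return [[0.5, 0.5, 0.5, 0.5],
                [0.5, 0.5, 0.5, 0.5]]

def expected_posterior_entropy(belief, action):
    """Expected entropy after `action`, marginalizing over observations."""
    expected = 0.0
    for likelihoods in obs_model(action):
        p_obs = sum(b * l for b, l in zip(belief, likelihoods))
        if p_obs > 0.0:
            expected += p_obs * entropy(bayes_update(belief, likelihoods))
    return expected

def select_action(belief, actions):
    """Pick the next touch that maximally reduces expected uncertainty."""
    return min(actions, key=lambda a: expected_posterior_entropy(belief, a))
```

Starting from a uniform belief, `select_action` prefers the discriminative touch (action 0) over the uninformative one, and repeated `bayes_update` calls concentrate the belief on the correct peg — the same loop, at toy scale, that the paper runs with real tactile images.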

 

  • Related News & Events

    •  NEWS    MERL researchers present 3 papers on Dexterous Manipulation at RSS 23.
      Date: July 11, 2023
      Where: Daegu, Korea
      MERL Contacts: Siddarth Jain; Devesh K. Jha; Arvind Raghunathan
      Research Areas: Artificial Intelligence, Machine Learning, Robotics
      Brief
      • MERL researchers presented 3 papers at the 19th edition of the Robotics: Science and Systems conference in Daegu, Korea. RSS is the flagship conference of the RSS Foundation and is run as a single-track conference presenting a limited number of high-quality papers. This year the main conference had a total of 112 papers. MERL researchers presented 2 papers in the main conference on planning and perception for dexterous manipulation; another paper was presented in a workshop on learning for dexterous manipulation. More details can be found at https://roboticsconference.org.
  • Related Publication

  •  Ota, K., Jha, D.K., Tung, H.-Y., Tenenbaum, J.B., "Tactile-Filter: Interactive Tactile Perception for Part Mating", arXiv, March 2023.
    @article{Ota2023mar,
      author = {Ota, Kei and Jha, Devesh K. and Tung, Hsiao-Yu and Tenenbaum, Joshua B.},
      title = {Tactile-Filter: Interactive Tactile Perception for Part Mating},
      journal = {arXiv},
      year = 2023,
      month = mar,
      url = {https://arxiv.org/abs/2303.06034}
    }