TR2021-031

Comparison of Three Feedback Modalities for Haptics Sensation in Remote Machine Manipulation


    •  Haruna, M., Ogino, M., Koike-Akino, T., "Comparison of Three Feedback Modalities for Haptics Sensation in Remote Machine Manipulation", IEEE Robotics and Automation Letters, DOI: 10.1109/LRA.2021.3070301, Vol. 6, No. 3, pp. 5040-5047, March 2021.
      @article{Haruna2021mar,
        author = {Haruna, Masaki and Ogino, Masaki and Koike-Akino, Toshiaki},
        title = {Comparison of Three Feedback Modalities for Haptics Sensation in Remote Machine Manipulation},
        journal = {IEEE Robotics and Automation Letters},
        year = 2021,
        volume = 6,
        number = 3,
        pages = {5040--5047},
        month = mar,
        doi = {10.1109/LRA.2021.3070301},
        issn = {2377-3766},
        url = {https://www.merl.com/publications/TR2021-031}
      }
  • Research Areas: Machine Learning, Robotics, Signal Processing

Abstract:

Previous studies have verified the usefulness of visual haptics for achieving an appropriate grasping force and a high task success rate when operating remote machines. However, its capabilities have not been evaluated objectively and quantitatively. We comprehensively compare three feedback modalities (i.e., sound, vibration, and light) for providing pseudo-haptic information on contact with an object, which we apply to grasping an object with a remotely operated robot arm. Experimental results verify that the light modality (i.e., visual haptics) minimizes both the grasping force and the processing load in the operator's brain. We then develop a prototype of a remote machine to demonstrate the feasibility of visual haptic feedback. We consider three implementations (i.e., a light-emitting diode, model-based superimposition, and model-less superimposition) and verify the performance of each. The results show that visual haptics can stabilize the performance of delicate tasks such as grasping and carrying fragile raw eggs and potato chips. We demonstrate that our visual haptics method (i.e., superimposing haptic information as images on the contact points of the robot's fingertips) can significantly improve the operability of remote machines without the need for highly complex and expensive interfaces.
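To illustrate the superimposition idea described in the abstract, the sketch below maps a fingertip contact-force reading to the parameters of a visual overlay (size and color) that could be drawn at the contact point in the operator's camera view. This is a minimal hypothetical sketch, not the authors' implementation: the function name, the linear force-to-size mapping, and the green-to-red color ramp are all illustrative assumptions.

```python
# Hypothetical sketch: map a measured fingertip force [N] to the radius
# and RGB color of an overlay marker at the robot's contact point.
# The linear mapping and color ramp are illustrative assumptions,
# not the method reported in the paper.

def force_to_overlay(force_n, f_max_n=10.0, max_radius_px=40):
    """Return (radius_px, (r, g, b)) for a contact force in newtons.

    The force is clipped to [0, f_max_n]; the marker grows linearly
    with force and shifts from green (light touch) to red (firm grip).
    """
    ratio = min(max(force_n / f_max_n, 0.0), 1.0)
    radius = int(round(ratio * max_radius_px))
    color = (int(255 * ratio), int(255 * (1.0 - ratio)), 0)  # (R, G, B)
    return radius, color
```

In a full pipeline, the returned radius and color would be rendered onto the video stream at each fingertip's projected contact location (e.g., with a standard image-drawing routine), giving the operator a purely visual cue of grasp intensity.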