Online Social Touch Pattern Recognition with Multi-modal-sensing Modular Tactile Interface

Hyun Jin Ku, Jason J. Choi, Sunho Jang, Wonkyung Do, Soomin Lee, Sangok Seok
2019 16th International Conference on Ubiquitous Robots (UR) Jeju, Korea, June 24-27, 2019

The capability to recognize various social touch patterns is necessary for robots engaged in touch-based social interaction, which is effective in many robot applications. Prior literature has focused on the novelty of the recognition system or on improvements in classification accuracy on publicly available datasets. In this paper, we propose an integrated framework for implementing a social touch recognition system for various robots, built on three complementary principles: 1) multi-modal tactile sensing, 2) a modular design, and 3) a social touch pattern classifier capable of learning temporal features. The approach is evaluated with an implemented Multi-modal-sensing Modular Tactile Interface prototype; for the classifier, three learning methods (HMM, LSTM, and 3D-CNN) were tested. The trained classifiers, which run online on a robot's embedded system, predict 18 classes of social touch patterns. Results of the online validation test show that all three methods are promising, with a best accuracy of 88.86%. In particular, the stable performance of the 3D-CNN indicates that learning 'spatiotemporal' features from tactile data is more effective. Through this validation process, we confirmed that our framework can be easily adopted and delivers robust performance for social touch pattern recognition.
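To make the online, spatiotemporal setting concrete, below is a minimal sketch of the data plumbing such a classifier would sit on: a rolling buffer that stacks incoming 2D tactile frames into a time-windowed volume. The grid size (8×8), window length (30 frames), and class names are hypothetical illustration parameters, not values taken from the paper; the actual classifier (HMM, LSTM, or 3D-CNN) would consume the stacked tensor this buffer produces.

```python
import numpy as np
from collections import deque

# Hypothetical parameters (not from the paper): an 8x8 taxel grid,
# a 30-frame temporal window, and 18 social touch classes.
GRID_H, GRID_W, WINDOW = 8, 8, 30
NUM_CLASSES = 18


class OnlineTouchBuffer:
    """Rolling spatiotemporal window of tactile frames, i.e. the
    input volume an online 3D-CNN-style classifier would consume."""

    def __init__(self, window=WINDOW):
        # deque with maxlen drops the oldest frame automatically,
        # giving a sliding window suitable for online prediction.
        self.frames = deque(maxlen=window)

    def push(self, frame):
        assert frame.shape == (GRID_H, GRID_W)
        self.frames.append(frame)

    def ready(self):
        # Only classify once a full window has accumulated.
        return len(self.frames) == self.frames.maxlen

    def tensor(self):
        # Shape (T, H, W): time-stacked frames, the 'spatiotemporal'
        # volume the abstract refers to.
        return np.stack(self.frames, axis=0)


buf = OnlineTouchBuffer()
for _ in range(WINDOW):
    buf.push(np.random.rand(GRID_H, GRID_W))
print(buf.ready())         # True
print(buf.tensor().shape)  # (30, 8, 8)
```

The sliding-window design means a prediction can be emitted every frame once the buffer fills, which is what allows the recognizer to run online rather than waiting for a touch episode to end.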
