Publication Information


Title
Japanese: 
English:Egocentric Human Activities Recognition With Multimodal Interaction Sensing 
Author
Japanese: HAO Yuzhe, 金崎 朝子, 佐藤 育郎, 川上 玲, 篠田 浩一.  
English: Yuzhe Hao, Asako Kanezaki, Ikuro Sato, Rei Kawakami, Koichi Shinoda.  
Language English 
Journal/Book name
Japanese: 
English:IEEE Sensors Journal 
Volume, Number, Page Vol. 24, No. 5, pp. 7085 - 7096
Published date Mar. 1, 2024 
Publisher
Japanese: 
English:IEEE 
Conference name
Japanese: 
English: 
Conference site
Japanese: 
English: 
DOI https://doi.org/10.1109/JSEN.2023.3349191
Abstract Egocentric human activity recognition (ego-HAR) has received attention in fields where human intentions in a video must be estimated. However, the performance of existing methods is limited by insufficient information about the subject's motion in egocentric videos. To overcome this problem, we propose to use inertial sensor data from both hands as a supplement to egocentric videos for the ego-HAR task. For this purpose, we construct a publicly available dataset, Egocentric Video and Inertial Sensor data Kitchen (EvIs-Kitchen), which contains well-synchronized egocentric videos and two-hand inertial sensor data and includes interaction-focused actions as recognition targets. We also determine the best choice of input combination and component variants through experiments with a two-branch late-fusion architecture. The results show that our multimodal setup outperforms single-modal methods on EvIs-Kitchen.
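
The two-branch late-fusion setup mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the backbone choices, feature dimensions, IMU channel count, and number of classes below are assumptions made only for illustration.

```python
# Illustrative sketch (assumed, not the paper's code): a generic two-branch
# late-fusion classifier combining egocentric-video features with two-hand
# inertial (IMU) sequences.
import torch
import torch.nn as nn


class TwoBranchLateFusion(nn.Module):
    def __init__(self, video_feat_dim=512, imu_channels=12,
                 hidden_dim=256, num_classes=20):
        super().__init__()
        # Video branch: an MLP over pre-extracted clip features
        # (a real system would use a video backbone such as a 3D CNN).
        self.video_branch = nn.Sequential(
            nn.Linear(video_feat_dim, hidden_dim),
            nn.ReLU(),
        )
        # IMU branch: a 1-D CNN over synchronized two-hand inertial streams
        # (e.g., 2 hands x 6 axes = 12 channels), with temporal pooling.
        self.imu_branch = nn.Sequential(
            nn.Conv1d(imu_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, hidden_dim, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Late fusion: concatenate the two branch embeddings, then classify.
        self.classifier = nn.Linear(hidden_dim * 2, num_classes)

    def forward(self, video_feat, imu_seq):
        v = self.video_branch(video_feat)         # (B, hidden_dim)
        m = self.imu_branch(imu_seq).squeeze(-1)  # (B, hidden_dim)
        return self.classifier(torch.cat([v, m], dim=1))


if __name__ == "__main__":
    model = TwoBranchLateFusion()
    video_feat = torch.randn(4, 512)   # pre-extracted clip features
    imu_seq = torch.randn(4, 12, 200)  # (batch, channels, time steps)
    logits = model(video_feat, imu_seq)
    print(logits.shape)                # torch.Size([4, 20])
```

In a late-fusion design like this, each modality keeps its own encoder and the features are merged only before the final classifier, which is what allows the single-modal branches and the fused model to be compared under the same training setup.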
