A dynamic modelling framework for human hand gesture task recognition

Sara Masoud, Bijoy Chowdhury, Young Jun Son, Chieri Kubota, Russell Tronstad

Research output: Contribution to conference › Paper › peer-review


Abstract

Gesture recognition and hand motion tracking are important tasks in advanced gesture-based interaction systems. In this paper, we propose applying a sliding-window filtering approach to sample the incoming streams of data from data gloves, and a decision tree model to recognize gestures in real time for a manual grafting operation at a vegetable seedling propagation facility. The sequence of recognized gestures defines the tasks taking place, which helps evaluate individuals' performance and identify bottlenecks in real time. In this work, two pairs of data gloves are utilized, which report the locations of the fingers, hands, and wrists wirelessly (i.e., via Bluetooth). To evaluate the performance of the proposed framework, a preliminary experiment was conducted in multiple lab settings of tomato grafting operations, where multiple subjects wore the data gloves while performing different tasks. Our results show an average accuracy of 91% for real-time gesture recognition using the proposed framework.
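The pipeline described in the abstract — sliding-window sampling of glove sensor streams followed by decision-tree classification — can be sketched as below. This is a minimal illustration, not the authors' implementation: the window length, step size, per-channel mean/std features, and the synthetic two-gesture data are all assumptions made for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def window_features(stream, window=10, step=5):
    """Slide a fixed-size window over a (time, channels) sensor stream
    and summarize each window with per-channel mean and std.
    Window/step sizes are illustrative choices, not from the paper."""
    feats = []
    for start in range(0, len(stream) - window + 1, step):
        w = stream[start:start + window]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    return np.array(feats)

rng = np.random.default_rng(0)
# Two synthetic "gestures": 5 glove channels with different baseline offsets
gesture_a = rng.normal(0.0, 0.1, size=(200, 5))
gesture_b = rng.normal(1.0, 0.1, size=(200, 5))

Xa, Xb = window_features(gesture_a), window_features(gesture_b)
X = np.vstack([Xa, Xb])
y = np.array([0] * len(Xa) + [1] * len(Xb))

# A shallow tree keeps per-window inference fast enough for real-time use
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.score(X, y))
```

At run time, the same windowing would be applied to the live Bluetooth stream, and each window's predicted label appended to the gesture sequence from which tasks are inferred.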

Original language: English (US)
Pages: 563-568
Number of pages: 6
State: Published - 2018
Event: 2018 Institute of Industrial and Systems Engineers Annual Conference and Expo, IISE 2018 - Orlando, United States
Duration: May 19, 2018 - May 22, 2018


Keywords

  • Decision tree
  • Hand gesture
  • K-means
  • Task recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Industrial and Manufacturing Engineering

