A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods

Sensors (Basel). 2020 Feb 16;20(4):1074. doi: 10.3390/s20041074.

Abstract

Real-time sensing and modeling of the human body, especially the hands, is an important research endeavor for various applications, such as natural human–computer interaction. Hand pose estimation is a major academic and technical challenge due to the complex structure and dexterous movement of human hands. Boosted by advances in both hardware and artificial intelligence, various prototypes of data gloves and computer-vision-based methods have been proposed in recent years for accurate and rapid hand pose estimation. However, existing reviews have focused either on data gloves or on vision-based methods, or even on a particular type of camera, such as the depth camera. The purpose of this survey is to provide a comprehensive and timely review of recent research advances in sensor-based hand pose estimation, covering both wearable and vision-based solutions. Hand kinematic models are discussed first. An in-depth review is then conducted of data gloves and vision-based sensor systems, together with the corresponding modeling methods. In particular, this review also discusses deep-learning-based methods, which show great promise for hand pose estimation. Moreover, the advantages and drawbacks of current hand pose estimation methods, their scope of application, and the related challenges are discussed.
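The hand kinematic models mentioned above typically treat each finger as a short chain of articulated joints (MCP, PIP, DIP) whose angles determine the fingertip position. The snippet below is a purely illustrative sketch, not taken from the survey: it computes the fingertip position of a single planar finger chain via forward kinematics, with assumed bone lengths and joint angles.

```python
# Illustrative sketch only: a single finger as a planar kinematic chain,
# the simplest form of the hand kinematic models such surveys discuss.
# Bone lengths and joint angles below are assumed, not taken from the paper.
import numpy as np

# Proximal, middle, and distal phalanx lengths in millimetres (assumed values).
BONE_LENGTHS_MM = [40.0, 25.0, 20.0]


def fingertip_position(mcp_flexion, pip_flexion, dip_flexion):
    """Planar forward kinematics: accumulate flexion angles (radians) along
    the MCP -> PIP -> DIP chain and sum the resulting bone vectors."""
    angles = np.cumsum([mcp_flexion, pip_flexion, dip_flexion])
    x = sum(l * np.cos(a) for l, a in zip(BONE_LENGTHS_MM, angles))
    y = sum(l * np.sin(a) for l, a in zip(BONE_LENGTHS_MM, angles))
    return np.array([x, y])


if __name__ == "__main__":
    # Fully extended finger: the tip lies on the x-axis at the summed bone length.
    print(fingertip_position(0.0, 0.0, 0.0))          # -> [85.  0.]
    # Moderately flexed finger (assumed example angles).
    print(fingertip_position(np.pi / 6, np.pi / 4, np.pi / 6))
```

A full hand model repeats such chains for all five digits and adds the wrist and thumb degrees of freedom; pose estimation then amounts to recovering these joint angles (or joint positions) from glove sensor readings or camera images.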

Keywords: computer vision; data gloves; deep learning; hand pose estimation; human–computer interaction; wearable devices.

Publication types

  • Review

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Biomechanical Phenomena
  • Hand / physiology*
  • Humans
  • User-Computer Interface
  • Wearable Electronic Devices*