Hand gesture recognition for user interaction in an Augmented Reality game
Keywords:
Hand Gesture Recognition, Human-Computer Interaction, Augmented Reality, Data Fusion, Transfer Learning, Deep Learning
Abstract:
This paper presents a hand gesture recognition system for an interactive augmented reality (AR) game that fuses skeletal and RGB image data to improve recognition accuracy. We collected a comprehensive dataset comprising RGB images and skeletal coordinates for five distinct hand gestures. A Late Fusion model, which combines skeletal data with RGB image information, was proposed and achieved a test accuracy of 88.20%. The model was integrated into a Unity 3D game, allowing players to control in-game actions through intuitive hand gestures. Experimental results demonstrate the effectiveness of the proposed approach in enhancing user interaction and delivering a responsive gaming experience in AR environments.
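To make the late-fusion idea concrete, the following is a minimal PyTorch sketch of a two-branch classifier that concatenates image and skeleton features before a shared classification head. It is only an illustration of the general technique, not the authors' exact architecture: the 224x224 input size, 21 hand keypoints with (x, y, z) coordinates, layer widths, and class count are all assumptions.

```python
import torch
import torch.nn as nn

class LateFusionGestureNet(nn.Module):
    """Illustrative late-fusion classifier: one branch per modality,
    fused by concatenating branch features before the final classifier."""

    def __init__(self, num_classes: int = 5, num_keypoints: int = 21):
        super().__init__()
        # Image branch: a small CNN over RGB frames (224x224 assumed).
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 64), nn.ReLU(),
        )
        # Skeleton branch: an MLP over flattened (x, y, z) joint coordinates.
        self.skeleton_branch = nn.Sequential(
            nn.Linear(num_keypoints * 3, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fusion head: classifies the concatenated branch features.
        self.classifier = nn.Linear(64 + 64, num_classes)

    def forward(self, image: torch.Tensor, skeleton: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)          # (B, 64)
        skel_feat = self.skeleton_branch(skeleton)   # (B, 64)
        fused = torch.cat([img_feat, skel_feat], dim=1)
        return self.classifier(fused)                # (B, num_classes) logits


# Example forward pass with dummy inputs.
model = LateFusionGestureNet()
logits = model(torch.randn(2, 3, 224, 224), torch.randn(2, 63))
print(logits.shape)  # torch.Size([2, 5])
```

In a deployment such as the Unity 3D integration described above, the predicted class would typically be sent to the game engine (e.g., over a local socket) to trigger the corresponding in-game action.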