Aitsam, Muhammad, Davies, Sergio and Di Nuovo, Alessandro (2024) Event camera-based real-time gesture recognition for improved robotic guidance. In: 2024 International Joint Conference on Neural Networks (IJCNN), 30 June 2024 - 5 July 2024, Yokohama, Japan.
Accepted Version
Available under License Creative Commons Attribution.
Abstract
Recent breakthroughs in event-based vision, driven by the capabilities of high-resolution event cameras, have significantly improved human-robot interaction. Event cameras excel at handling high dynamic range and motion blur, adapting seamlessly to varied environmental conditions. The research presented in this paper leverages this technology to develop an intuitive robot guidance system capable of interpreting hand gestures for precise robot control. We introduce "EB-HandGesture", a high-resolution event-based hand-gesture dataset, and use it with our network "ConvRNN" to achieve 95.7% accuracy on the interpretation task, covering six gesture types under different lighting conditions. To validate our framework, real-life experiments were conducted with the ARI robot, confirming the effectiveness of the trained network across the various interaction processes. This research represents a substantial step forward in ensuring safer, more reliable and more efficient human-robot collaboration in shared workspaces.
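The abstract describes a convolutional-recurrent ("ConvRNN") pipeline that classifies accumulated event-camera frames into six gesture classes. The paper does not publish its layer configuration here, so the following is a hypothetical minimal sketch of the general idea in NumPy (all sizes, names, and the simple Elman-style recurrence are illustrative assumptions, not the authors' architecture): per-frame convolutional features are fed through a recurrent state, and a softmax over six classes is read out after the last frame.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(x, k):
    """Naive 'valid' 2-D cross-correlation of a single-channel frame."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

class TinyConvRNN:
    """Illustrative sketch: conv feature extractor + Elman RNN + linear read-out.
    Weights are random; a real system would train them on the gesture dataset."""
    def __init__(self, frame_shape=(16, 16), hidden=32, n_classes=6):
        fh, fw = frame_shape[0] - 2, frame_shape[1] - 2  # 3x3 'valid' conv output
        self.kernel = rng.standard_normal((3, 3)) * 0.1
        self.W_in = rng.standard_normal((hidden, fh * fw)) * 0.05
        self.W_h = rng.standard_normal((hidden, hidden)) * 0.05
        self.W_out = rng.standard_normal((n_classes, hidden)) * 0.1

    def forward(self, frames):
        h = np.zeros(self.W_h.shape[0])
        for f in frames:
            # ReLU over conv features, then one recurrent update per frame
            feat = np.maximum(conv2d_valid(f, self.kernel), 0).ravel()
            h = np.tanh(self.W_in @ feat + self.W_h @ h)
        logits = self.W_out @ h
        e = np.exp(logits - logits.max())
        return e / e.sum()  # softmax over the six gesture classes

# Toy input: 10 accumulated event frames of 16x16 polarity counts.
frames = rng.poisson(0.2, size=(10, 16, 16)).astype(float)
probs = TinyConvRNN().forward(frames)
```

Event streams are typically binned into such fixed-interval frames (e.g. per-pixel event counts) before being fed to a frame-based network; the recurrent state is what lets the classifier integrate motion across the gesture's duration.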