This work presents the development and implementation of a unified multi-sensor human motion capture and gesture recognition system that can distinguish among and classify six different gestures. Data were collected from eleven participants using five wireless inertial measurement units (IMUs), a subset of a complete motion capture system, attached to their arms and upper body. We compare Support Vector Machines and Artificial Neural Networks on the same dataset under two different scenarios and evaluate the results. Our study indicates that near-perfect classification accuracy is achievable for small gestures and that classification is fast enough to support interactive use. However, such accuracy is harder to obtain when a participant's data are excluded from training, indicating that further work is needed before the system can generalize to the broader population.
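The pipeline the abstract describes (per-sensor orientation quaternions flattened into feature vectors, then fed to a classifier) can be illustrated with a minimal sketch. This is not the paper's SVM/ANN implementation: the nearest-centroid classifier, the function names, and the toy gesture labels below are illustrative assumptions, standing in for the trained models evaluated in the study.

```python
import math

def normalize(q):
    # Sensors report orientation as quaternions (w, x, y, z);
    # normalize to unit length before using as features.
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def features(frames):
    # frames: a time window of per-sensor quaternions, e.g. one tuple
    # per IMU per frame. Flatten into a single feature vector.
    return [c for frame in frames for q in frame for c in normalize(q)]

def train_centroids(samples):
    # samples: {gesture_label: [feature_vector, ...]}.
    # Per-class mean vector; a stand-in for training an SVM or ANN.
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(col) / n for col in zip(*vecs)]
    return centroids

def classify(vec, centroids):
    # Assign the label of the nearest centroid (Euclidean distance).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(vec, centroids[label]))
```

For example, with five sensors per frame, a "wave" window whose quaternions sit near identity and a "push" window rotated away from it would be flattened by `features` and separated cleanly by `classify`; real gesture data would of course need the trained discriminative models the paper evaluates.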

Additional Metadata
Keywords: Artificial neural networks, Gesture recognition, Machine learning, Pattern analysis, Quaternions, Support vector machines, Wearable sensors
Persistent URL: dx.doi.org/10.3390/s16050605
Journal: Sensors
Citation
Alavi, S. (Shamir), Arsenault, D. (Dennis), & Whitehead, A. (2016). Quaternion-based gesture recognition using wireless wearable motion capture sensors. Sensors, 16(5). doi:10.3390/s16050605