Human motion can be performed with a variety of affects or styles, such as happy, sad, energetic, and tired, among many others. Modeling and classifying these styles, and more importantly, translating them from one sequence onto another, has become a popular problem in graphics, multimedia, and human-computer interaction. In this paper, radial basis functions (RBFs) are used to model and extract stylistic and affective features from motion data. We demonstrate that with only a few basis functions per degree of freedom, styles in human walking cycles can be modeled successfully. Furthermore, we employ an ensemble of RBF neural networks to learn the affective/stylistic features following time warping and principal component analysis. The system learns the components and classifies stylistic motion sequences into distinct affective and stylistic classes. It also uses the ensemble of neural networks to learn motion affects and styles so that it can translate them onto neutral input sequences. Experimental results, together with numerical and perceptual validations, confirm the highly accurate and effective performance of the system.
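To illustrate the core idea of representing each degree of freedom with a few radial basis functions, the following is a minimal sketch (not the paper's implementation): a toy "joint angle" walking cycle is approximated by a least-squares fit of Gaussian RBFs. The number of centers, their placement, and the width are illustrative assumptions.

```python
import numpy as np

def fit_rbf(t, y, centers, width):
    """Least-squares fit of Gaussian radial basis functions to a 1-D trajectory."""
    # Design matrix: one Gaussian per center, evaluated at each time sample.
    Phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))
    # Weights solving min ||Phi w - y||^2.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w, Phi

# Toy stand-in for one degree of freedom over a single walk cycle (100 frames).
t = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * t) + 0.3 * np.sin(4 * np.pi * t)

# Only a few basis functions per degree of freedom, in the spirit of the paper.
centers = np.linspace(0.0, 1.0, 8)
w, Phi = fit_rbf(t, y, centers, width=0.12)
recon = Phi @ w
print("max reconstruction error:", np.abs(recon - y).max())
```

The fitted weight vector `w` is the compact per-DOF representation; in the paper's pipeline such features (after time warping and PCA) feed the RBF neural network ensemble for classification and style translation.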

Additional Metadata
Keywords Classification, Motion capture, Neural networks, Radial basis functions, Style translation
Persistent URL dx.doi.org/10.1016/j.neucom.2013.09.001
Journal Neurocomputing
Citation
Ali Etemad, S., & Arya, A. (2014). Classification and translation of style and affect in human motion using RBF neural networks. Neurocomputing, 129, 585–595. doi:10.1016/j.neucom.2013.09.001