This paper presents a method for automatic temporal location and recognition of human actions. The data are obtained from a motion capture system, then animated, and optical flow vectors are subsequently calculated. The system operates in two phases: the first employs nearest-neighbor search to locate an action along the temporal axis, taking into account both the angle and length of the flow vectors, while the second classifies the located action using artificial neural networks. Principal Component Analysis (PCA) plays a significant role in discarding correlated flow vectors. We perform a statistical analysis to achieve an efficient, adaptive, and targeted PCA, which greatly improves the configuration of flow vectors used to train both the locating and classifying systems. Experimental results confirm the effectiveness of the proposed method for locating and classifying a specific action within a sequential combination of actions.
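The abstract outlines a two-phase pipeline: PCA reduces correlated optical-flow dimensions, and a nearest-neighbor search over angle and length features localizes an action in time. The sketch below is a minimal, hypothetical illustration of that pipeline in NumPy; the function names (`pca_reduce`, `flow_descriptor`, `locate_action`), the weighting parameters, and the sliding-window matching scheme are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def pca_reduce(X, n_components):
    # Center the data and keep the top principal components, discarding
    # correlated (redundant) flow-vector dimensions, in the spirit of the
    # paper's targeted PCA step. (Illustrative, not the authors' code.)
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

def flow_descriptor(flow):
    # flow: (n, 2) array of optical-flow vectors (dx, dy).
    # Describe a frame by the angle and length of its vectors, the two
    # quantities the localization phase is said to compare.
    angles = np.arctan2(flow[:, 1], flow[:, 0])
    lengths = np.linalg.norm(flow, axis=1)
    return np.concatenate([angles, lengths])

def locate_action(query_frames, template, w_angle=1.0, w_length=1.0):
    # Slide the template descriptors over the sequence and return the start
    # index of the nearest-neighbor match, weighting angle and length terms
    # separately (weights are a hypothetical parameterization).
    n, t = len(query_frames), len(template)
    best, best_idx = np.inf, 0
    for i in range(n - t + 1):
        d = 0.0
        for f, g in zip(query_frames[i:i + t], template):
            half = len(f) // 2
            d += w_angle * np.linalg.norm(f[:half] - g[:half])
            d += w_length * np.linalg.norm(f[half:] - g[half:])
        if d < best:
            best, best_idx = d, i
    return best_idx
```

In this sketch, a classifier (the paper uses artificial neural networks) would then label the window that `locate_action` returns.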

Additional Metadata
Keywords Classification, Human actions, Neural networks, Principal component analysis, Temporal location
Persistent URL dx.doi.org/10.1109/CISP.2009.5304683
Conference 2009 2nd International Congress on Image and Signal Processing, CISP'09
Citation
Etemad, S.A. (Seyed Ali), Payeur, P. (Pierre), & Arya, A. (2009). Automatic temporal location and classification of human actions based on optical features. Presented at the 2009 2nd International Congress on Image and Signal Processing, CISP'09. doi:10.1109/CISP.2009.5304683