
Deep convolutional neural networks for motion instability identification using kinect



Abstract:

Evaluating the execution style of human motion can give insight into the performance and behaviour exhibited by the participant. This could enable support in developing personalised rehabilitation programmes by providing a better understanding of motion mechanics and contextual behaviour. However, performing analyses and generating statistical representations and models which are free from external bias, repeatable and robust is a difficult task. In this work, we propose a framework which evaluates clinically valid motions to identify unstable behaviour during performance using Deep Convolutional Neural Networks. The framework is composed of two parts: 1) Instead of using the whole skeleton as input, we divide the human skeleton into five joint groups. For each group, feature encoding is used to represent the spatial and temporal domains, permitting high-level abstraction and removing noise; the encoded features are then represented using distance matrices. 2) The encoded representations are labelled using an automatic labelling method and evaluated using deep learning. Experimental results demonstrate the ability to correctly classify data compared to classical approaches.
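
The distance-matrix encoding in part 1 can be illustrated with a short sketch. The joint indices, the example joint group and the clip dimensions below are illustrative assumptions (the abstract does not specify the exact grouping); this is a minimal sketch in Python with NumPy, not the authors' implementation.

import numpy as np

# Hypothetical Kinect-style joint indices for one of the five joint
# groups (e.g. a left-arm group); the exact grouping is an assumption.
LEFT_ARM = [4, 5, 6, 7]  # shoulder, elbow, wrist, hand

def group_distance_matrix(skeleton_frame, joint_ids):
    """Encode one frame of a joint group as a pairwise Euclidean
    distance matrix (the spatial domain)."""
    pts = skeleton_frame[joint_ids]           # (J, 3) joint positions
    diff = pts[:, None, :] - pts[None, :, :]  # (J, J, 3) pairwise offsets
    return np.linalg.norm(diff, axis=-1)      # (J, J) distances

def encode_sequence(skeleton_seq, joint_ids):
    """Stack per-frame distance matrices over time (the temporal
    domain), yielding an image-like tensor a CNN can consume."""
    return np.stack([group_distance_matrix(f, joint_ids)
                     for f in skeleton_seq])

# Example: a 90-frame clip of 25 Kinect joints in 3-D.
clip = np.random.rand(90, 25, 3)
encoded = encode_sequence(clip, LEFT_ARM)
print(encoded.shape)  # (90, 4, 4): time x joints x joints

Each per-group tensor of this form could then be labelled automatically and passed to a 2-D convolutional network for the classification described in part 2.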
Date of Conference: 08-12 May 2017
Date Added to IEEE Xplore: 20 July 2017
Conference Location: Nagoya, Japan

1 Introduction

There has been significant interest in digital analysis methods for the detection and quantification of human motion for use in electronic health interventions [1]. This is due, in part, to the increased availability of low-cost, marker-less, multi-modality capture devices. Sensor technology (e.g. the Microsoft Kinect) offers new dimensions by harnessing multiple techniques such as feature extraction and encoding. This has been observed in the work of Bigy et al. [2], who proposed a technique for recognising posture and Freezing of Gait in those with Parkinson's disease to aid in detecting trips and falls within the home. Yang et al. [3] implemented a framework that extracts both depth and colour image data from the Kinect to assess the posture of participants performing standing balance; the framework allowed for the detection of subtle directional changes such as postural sway.

References
1. B. F. Mentiplay, L. G. Perraton, K. J. Bower, Y. H. Pua, R. McGaw, S. Heywood, et al., "Gait assessment using the Microsoft Xbox One Kinect: Concurrent validity and inter-day reliability of spatiotemporal and kinematic variables", Journal of Biomechanics, vol. 48, no. 10, pp. 2166-2170, 2015.
2. A. Amini Maghsoud Bigy, K. Banitsas, A. Badii and J. Cosmas, "Recognition of postures and freezing of gait in Parkinson's disease patients using Microsoft Kinect sensor", IEEE Conference on Neural Engineering, pp. 731-734, 2015.
3. Y. Yang, F. Pu, Y. Li, S. Li, Y. Fan and D. Li, "Reliability and validity of Kinect RGB-D sensor for assessing standing balance", Sensors, pp. 1633-1638, 2014.
4. D. Leightley, M. H. Yap, J. Coulson, M. Piasecki, J. Cameron, Y. Barnouin, et al., "Postural stability during standing balance and sit-to-stand in master athlete runners compared with non-athletic old and young adults", Journal of Aging and Physical Activity, 2016.
5. R. Clark, Y. H. Pua, K. Fortin, C. Ritchie, K. Webster, L. Denehy, et al., "Validity of the Microsoft Kinect for assessment of postural control", Gait and Posture, vol. 36, no. 3, pp. 372-377, 2012.
6. Y. Du, W. Wang and L. Wang, "Hierarchical recurrent neural network for skeleton based action recognition", IEEE Conference on Computer Vision and Pattern Recognition, pp. 1110-1118, 2015.
7. D. Leightley, J. S. McPhee and M. H. Yap, "Automated analysis and quantification of human mobility using a depth sensor", IEEE Journal of Biomedical and Health Informatics, 2016.
8. W. Zhu, C. Lan, J. Xing, W. Zeng, Y. Li, L. Shen, et al., "Co-occurrence feature learning for skeleton based action recognition using regularized deep LSTM networks", CoRR, vol. abs/1603.07772, 2016.
9. D. Leightley, M. H. Yap, B. M. Hewitt and J. S. McPhee, "Sensing behaviour using the Kinect: Identifying characteristic features of instability and poor performance during challenging balancing tasks", Measuring Behavior 2016, May 2016.
10. J. Darby, B. Li, R. Cunningham and N. Costen, "Object localisation via action recognition", IEEE Conference on Pattern Recognition, pp. 817-820, 2012.
11. J. Wang, Z. Liu, Y. Wu and J. Yuan, "Mining actionlet ensemble for action recognition with depth cameras", IEEE Conference on Computer Vision and Pattern Recognition, pp. 1290-1297, 2012.
12. S. Li, W. Zhang and A. B. Chan, "Maximum-margin structured learning with deep networks for 3D human pose estimation", The IEEE International Conference on Computer Vision (ICCV), December 2015.
13. E. P. Ijjina and C. K. Mohan, "Human action recognition based on MoCap information using convolution neural networks", 2014 13th International Conference on Machine Learning and Applications (ICMLA), pp. 159-164, Dec 2014.
14. D. Leightley, M. H. Yap, J. Coulson, Y. Barnouin and J. S. McPhee, "Benchmarking human motion analysis using Kinect One: An open source dataset", 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), pp. 1-7, Dec 2015.
15. J. Guralnik, E. Simonsick, L. Ferrucci, R. Glynn, L. Berkman, D. Blazer, et al., "A short physical performance battery assessing lower extremity function: Association with self-reported disability and prediction of mortality and nursing home admission", Journal of Gerontology, vol. 49, no. 2, pp. 85-93, March 1994.
16. M. Firman, "RGBD datasets: Past, present and future", Computer Vision and Pattern Recognition Workshops, 2016.