
From Activity Recognition to Intention Recognition for Assisted Living Within Smart Homes



Abstract:

The global population is aging; projections show that by 2050, more than 20% of the population will be aged over 64. This will lead to an increase in aging-related illness, a decrease in informal support, and, ultimately, difficulties in providing care for these individuals. Assistive smart homes offer a promising solution to some of these issues; nevertheless, a number of issues currently hinder their adoption. To help address some of these issues, this study introduces a novel approach to implementing assistive smart homes. The devised approach is based upon an intention recognition mechanism incorporated into an intelligent agent architecture. This approach is detailed and evaluated. Evaluation was performed across three scenarios. Scenario 1 involved a web interface, focusing on testing the intention recognition mechanism. Scenarios 2 and 3 involved retrofitting a home with sensors and providing assistance with activities over a period of 3 months. The average accuracy for these three scenarios was 100%, 64.4%, and 83.3%, respectively. Future work will extend and further evaluate this approach by implementing advanced sensor-filtering rules and evaluating more complex activities.
Published in: IEEE Transactions on Human-Machine Systems ( Volume: 47, Issue: 3, June 2017)
Page(s): 368 - 379
Date of Publication: 05 January 2017


I. Introduction

The worldwide population is aging, resulting in an uneven demographic composition [1], [2]. By 2050, more than 20% of the population is projected to be aged over 64 [1], [2]. This growth in the aging population is expected to produce an increase in age-related illness which, in turn, will place additional burdens on healthcare provision [2]. In addition, the amount of informal support available will decrease due to a reduction in the global potential support ratio (PSR). The PSR is the ratio of working-age people (15–64) to those older than 64 [1]. The PSR was 12:1 in 1950 and 9:1 in 2009, and is expected to continue its downward trend, reaching a low of 4:1 by 2050 [1].

References

[1] United Nations, World Population Ageing 2009. New York, NY, USA: United Nations, 2010.
[2] E. De Luca d'Alessandro, S. Bonacci, and G. Giraldi, "Aging populations: The health and quality of life of the elderly," Clin. Terapeutica, vol. 162, no. 1, pp. e13–e28, 2011.
[3] D. J. Cook and S. K. Das, "How smart are our environments? An updated look at the state of the art," Pervasive Mobile Comput., vol. 3, no. 2, pp. 53–73, Mar. 2007.
[4] M. Chan, D. Estève, C. Escriba, and E. Campo, "A review of smart homes-present state and future challenges," Comput. Methods Programs Biomed., vol. 91, no. 1, pp. 55–81, Jul. 2008.
[5] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 42, no. 6, pp. 790–808, Nov. 2012.
[6] M. P. Poland, C. D. Nugent, H. Wang, and L. Chen, "Smart home research: Projects and issues," Int. J. Ambient Comput. Intell., vol. 1, no. 4, pp. 32–45, Jan. 2009.
[7] L. Bao and S. Intille, "Activity recognition from user-annotated acceleration data," in Pervasive Computing, vol. 3001, pp. 1–17, 2004.
[8] E. Tapia, S. S. Intille, and K. Larson, "Activity recognition in the home using simple and ubiquitous sensors," in Pervasive Computing. Berlin, Germany: Springer, vol. 3001, pp. 158–175, 2004.
[9] D. J. Cook and M. Schmitter-Edgecombe, "Assessing the quality of activities in a smart environment," Methods Inf. Med., vol. 48, no. 5, pp. 480–485, 2009.
[10] T. van Kasteren and B. Kröse, "Bayesian activity recognition in residence for elders," in Proc. 3rd IET Int. Conf. Intell. Environ., 2007, pp. 209–212.
[11] U. Maurer, A. Rowe, A. Smailagic, and D. Siewiorek, "Location and activity recognition using eWatch: A wearable sensor platform," in Ambient Intelligence in Everyday Life. Berlin, Germany: Springer-Verlag, pp. 86–102, 2006.
[12] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proc. 17th Conf. Innovative Appl. Artif. Intell., vol. 3, 2005, pp. 1541–1546.
[13] M. Stikic and B. Schiele, "Activity recognition from sparsely labeled data using multi-instance learning," in Location and Context Awareness, vol. 5561, pp. 156–173, 2009.
[14] D. L. Vail, M. M. Veloso, and J. D. Lafferty, "Conditional random fields for activity recognition," in Proc. 6th Int. Joint Conf. Auton. Agents Multiagent Syst., 2007, pp. 235:1–235:8.
[15] C. Sutton, A. McCallum, and K. Rohanimanesh, "Dynamic conditional random fields: Factorized probabilistic models for labeling and segmenting sequence data," J. Mach. Learn. Res., vol. 8, pp. 693–723, May 2007.
[16] O. Brdiczka, J. L. Crowley, and P. Reignier, "Learning situation models in a smart home," IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 39, no. 1, pp. 56–63, Feb. 2009.
[17] L. Chen and C. Nugent, "A logical framework for behaviour reasoning and assistance in a smart home," Int. J. Assist. Robot. Mechatron., vol. 9, pp. 20–34, 2008.
[18] B. Bouchard, S. Giroux, and A. Bouzouane, "A smart home agent for plan recognition," Adv. Artif. Intell., vol. 1, no. 5, pp. 53–62, 2006.
[19] L. Chen, C. D. Nugent, and H. Wang, "A knowledge-driven approach to activity recognition in smart homes," IEEE Trans. Knowl. Data Eng., vol. 24, no. 6, pp. 961–974, Jun. 2012.
[20] N. Yamada, K. Sakamoto, and G. Kunito, "Applying ontology and probabilistic model to human activity recognition from surrounding things," IPSJ Digit. Courier, vol. 3, pp. 506–517, 2007.
[21] V. Vassilev, M. Ulman, and K. Ouazzane, "OntoCarer: An ontological framework for assistive agents for the disabled," in Proc. 3rd Int. Conf. Digit. Inf. Process. Commun., 2013, pp. 404–416.
[22] F. Latfi, "Ontology-based management of the telehealth smart home dedicated to elderly in loss of cognitive autonomy," in Proc. CEUR Workshop, 2007.
[23] M. Klein, A. Schmidt, and R. Lauer, "Ontology-centred design of an ambient middleware for assisted living: The case of SOPRANO," in Proc. 30th Annu. German Conf. Artif. Intell., 2007.
[24] R. Hervás, J. Bravo, J. Fontecha, and V. Villarreal, "Achieving adaptive augmented reality through ontological context-awareness applied to AAL scenarios," J. Univ. Comput. Sci., vol. 19, no. 9, pp. 1334–1349, 2013.
[25] B. Chandrasekaran, J. R. Josephson, and V. R. Benjamins, "What are ontologies, and why do we need them?" IEEE Intell. Syst., vol. 14, no. 1, pp. 20–26, Jan. 1999.
[26] B. Das, N. C. Krishnan, and D. J. Cook, "Automated activity interventions to assist with activities of daily living," in Agents Ambient Intell.: Achievements Challenges Intersect. Agent Technol. Ambient Intell., vol. 12, pp. 137–159, 2000.
[27] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE J. Biomed. Health Informat., vol. 17, no. 3, pp. 579–590, May 2013.
[28] J. Rafferty, L. Chen, C. Nugent, and J. Liu, "Goal lifecycles and ontological models for intention based assistive living within smart environments," Comput. Syst. Sci. Eng., vol. 30, no. 1, pp. 7–18, 2015.
[29] J. Rafferty, C. Nugent, J. Liu, and L. Chen, "Automatic metadata generation through analysis of narration within instructional videos," J. Med. Syst., vol. 39, no. 9, pp. 1–7, 2015.
[30] J. Rafferty, C. Nugent, J. Liu, and L. Chen, "A mechanism for nominating video clips to provide assistance for instrumental activities of daily living," in Ambient Assisted Living and Daily Activities. Puerto Varas, Chile: Springer, vol. 9455, pp. 65–76, 2015.
