Dem@Home: Ambient Intelligence for Clinical Support
of People Living with Dementia
Stelios Andreadis, Thanos G. Stavropoulos, Georgios Meditskos
and Ioannis Kompatsiaris
Information Technologies Institute, Centre for Research and Technology Hellas, Greece
{andreadisst, athstavr, gmeditsk, ikom}@iti.gr
Abstract. With the ever-growing prevalence of dementia, nursing costs are increasing, while the ability to live independently diminishes. Dem@Home is an ambient assisted living framework to support independent living while receiving intelligent clinical care. Dem@Home integrates a variety of ambient and wearable sensors together with sophisticated, interdisciplinary methods of image and semantic analysis. Semantic Web technologies, such as OWL 2, are extensively employed to represent sensor observations and application domain specifics, as well as to implement hybrid activity recognition and problem detection. Complete with tailored user interfaces, clinicians are provided with accurate monitoring of multiple life aspects, such as physical activity, sleep, complex daily tasks and clinical problems, leading to adaptive non-pharmaceutical interventions. The method has already been validated, for both recognition performance and improvement on a clinical level, in four home pilots.
Keywords: ambient assisted living, sensors, semantic web, ontologies, reasoning, context-awareness, dementia
1 Introduction
The increase of the average lifespan across the world has been accompanied by an unprecedented upsurge in the occurrence of dementia, with high socio-economic costs reaching 818 billion US dollars worldwide in 2015. Moreover, its prevalence keeps rising: the number of people aged 65 and older with Alzheimer's disease may nearly triple by 2050, from 46.8 million to 131 million people around the world, the majority of whom live in an institution [1].
Assistive technologies could enhance clinicians' diagnosis and decision making, in order to meet individual needs, and could also serve as an objective measure of patients' cognitive status and disease progression. Furthermore, assistive technology is expected to play a critical role in improving patients' quality of life, at both the cognitive and the physical level, while reducing costs. Drawbacks of current health services are that they often evaluate single needs (e.g. pharmacological treatment) or detect problems solely via interviews, leading to generic interventions by clinicians. In contrast, remote monitoring of patients at home is a promising "patient-centered" management approach that provides specific and reliable data, enabling clinicians to monitor patients' daily function and provide adaptive and personalized interventions.
Towards this direction, we propose Dem@Home, a holistic approach for context-aware monitoring and personalized care of dementia at home, prolonging independent living. To begin with, the system integrates a wide range of sensor modalities and high-level analytics to support accurate monitoring of all aspects of daily life, including physical activity, sleep and activities of daily living (ADLs), based on a service-oriented middleware [2]. After integrating them in a uniform knowledge representation format, Dem@Home employs semantic interpretation techniques to recognize complex activities from atomic events and highlight clinical problems. Specifically, it follows a hybrid reasoning scheme, using DL reasoning for activity detection and SPARQL to extract clinical problems. Finally, Dem@Home presents information to applications tailored to clinicians and patients, endorsing technology-aided clinical interventions to improve care. Dem@Home has been deployed and evaluated in four home pilots, showing promising results with respect to accurate fusion and activity detection, as well as clinical value in care.
The rest of the paper is structured as follows: Section 2 presents relevant work,
while Section 3 gives an overview of the framework. Section 4 elaborates on data
analytics, presenting the activity recognition and problem detection capabilities of
Dem@Home. Section 5 describes the GUIs supported by the framework to provide
feedback to clinical experts or patients, Section 6 presents the evaluation results and
Section 7 concludes the paper.
2 Related Work
Pervasive technology solutions have already been employed in several ambient environments, either homes or clinics, but most of them focus on a single domain to monitor, using only a single or a few devices. Such applications include wandering behavior prevention with geolocation devices, and monitoring of physical activity, sleep, medication and performance in daily chores [3, 4].
In order to assess cognitive state, activity modelling and recognition appears to be a critical task, common amongst existing assistive technologies. OWL has been widely used for modelling human activity semantics, reducing complex activity definitions to the intersection of their constituent parts. In most cases, activity recognition involves the segmentation of data into snapshots of atomic events, fed to the ontology reasoner for classification. Time windows [5] and slices [6], which provide background knowledge about the order or duration [7] of activities, are common approaches to segmentation. In this paradigm, ontologies are used to model domain information, whereas rules, widely embraced to compensate for OWL's expressive limitations, aggregate activities, describing the conditions that drive the derivation of complex activities, e.g. temporal relations.
Focusing on clinical care through sensing, the work in [8] has deployed infrared motion sensors in clinics to monitor sleep disturbances, limited, though, to a single sensor. Similarly, the work in [9] presents a sensor network deployment in nursing homes in Taiwan to continuously monitor patients' vital signs using web-based technologies, verifying the system's accuracy, acceptance and usefulness. Nevertheless, it so far lacks the ability to fuse further sensor modalities, such as sleep and ambient sensing, and offers limited interoperability.
Other solutions involve smart home deployments of environmental sensors to observe and assess the activities of elderly and disabled people [10, 11]. The work in [12] monitors residents' physical activity and vital signs by using wearable sensors, door sensors to measure presence and "fully automated biomedical devices" in the bathroom, while the system presented in [13] provides security monitoring, with actuators to control doors, windows and curtains; none of the above, however, records sleep. On the other hand, Dem@Home offers a unified view of many life aspects, including sleep and activities, to automatically assess disturbances and their causes, aiding clinical monitoring and interventions.
3 The Dem@Home Framework
Dem@Home proposes a multidisciplinary approach that brings into effect the synergy of the latest advances in sensor technologies addressing a multitude of complementary modalities, large-scale fusion and mining, knowledge representation and intelligent decision-making support. In detail, as depicted in Fig. 1, the framework integrates several heterogeneous sensing modalities, such as physical activity and sleep sensor measurements, combined input from lifestyle sensors and higher-level image analytics, providing their uniform semantic representation and interpretation.
Fig. 1. Dem@Home architecture, sensors and clinical applications
The current selection of sensors is comprised of proprietary, low-cost, ambient or wearable devices, originally intended for lifestyle monitoring, repurposed to a medical context. Ambient depth cameras¹ collect both image and depth data. The Plug sensors² are attached to electronic devices, e.g. to cooking appliances, to collect power consumption data. Tags³ are attached to objects of interest, e.g. a drug box or a watering can, capturing motion events, and Presence sensors are modified Tags that detect people's presence in a room using IR motion. A wearable Wristwatch⁴ measures physical activity levels in terms of steps, while a pressure-based Sleep sensor⁵ is placed underneath the mattress to record sleep duration and interruptions.
Each device is integrated by dedicated modules that wrap its respective API, retrieve data and process them accordingly to generate atomic events from sensor observations, e.g. through aggregation. In the case of image data, computer vision techniques are employed to extract information about humans performing activities, such as opening the fridge, holding a cup or drinking [14]. All atomic events and observations are mapped to a uniform semantic representation for interoperability and stored in the system's Knowledge Base. Dem@Home applies further semantic analysis, activity recognition and detection of problems, i.e. anomalies, and all the derived information can then be used by domain-specific applications offering a tailored view to different types of users.
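As a rough illustration of this step, the following sketch shows how raw plug power readings might be aggregated into an atomic event before semantic mapping; the data layout, the 50 W threshold and the event name are hypothetical assumptions, not Dem@Home's actual integration code.

```python
# Illustrative sketch only: aggregating raw plug power readings into an atomic
# event (e.g. TurnKettleOn). The data layout and threshold are hypothetical
# assumptions, not part of Dem@Home's actual integration modules.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional


@dataclass
class PlugReading:
    timestamp: datetime
    watts: float


@dataclass
class AtomicEvent:
    event_type: str      # e.g. "TurnKettleOn"
    start: datetime
    end: datetime


def aggregate_usage(readings: List[PlugReading], event_type: str,
                    on_threshold_watts: float = 50.0) -> Optional[AtomicEvent]:
    """Return an atomic event spanning the period of active power consumption."""
    active = [r for r in readings if r.watts >= on_threshold_watts]
    if not active:
        return None
    return AtomicEvent(event_type, active[0].timestamp, active[-1].timestamp)
```

The resulting atomic events would then be mapped to the uniform semantic representation described above.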
4 Activity Recognition and Problem Detection
To obtain a more comprehensive image of an individual's condition and its progression, driving clinical interventions, Dem@Home employs semantic interpretation to perform intelligent fusion and aggregation of atomic sensor events into complex ones and to identify problematic situations, through a hybrid combination of OWL 2 reasoning and SPARQL queries.
Dem@Home provides a simple pattern for modelling the context of complex activities. First of all, sensor observations, including location, posture, object movement and actions, are integrated with complex activities in a uniform model, as types of events, extending the leo:Event class of LODE⁶ (Fig. 1). The agents of the events and the temporal context are captured using constructs from DUL⁷ and OWL Time⁸, respectively.
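As a minimal sketch of this representation, assuming the rdflib library and a hypothetical dem: application namespace, an atomic observation could be encoded as follows; leo:Event, LODE and OWL Time come from the paper, while the instance names and the exact property choices (lode:atTime, time:inXSDDateTime) are illustrative assumptions.

```python
# Minimal sketch with rdflib: an atomic observation as an event aligned with
# LODE and OWL Time. The dem: namespace and instance names are hypothetical.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS, XSD

LEO = Namespace("http://linkedevents.org/ontology/")
TIME = Namespace("http://www.w3.org/2006/time#")
DEM = Namespace("http://example.org/demhome#")  # hypothetical application namespace

g = Graph()

# The domain event type extends leo:Event, as in the uniform model.
g.add((DEM.TurnKettleOn, RDFS.subClassOf, LEO.Event))

# An atomic observation produced by the Plug sensor module, with its temporal context.
obs, instant = DEM.obs_0042, DEM.obs_0042_time
g.add((obs, RDF.type, DEM.TurnKettleOn))
g.add((obs, LEO.atTime, instant))
g.add((instant, RDF.type, TIME.Instant))
g.add((instant, TIME.inXSDDateTime,
       Literal("2015-07-12T08:15:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```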
Each activity context is described through class equivalence axioms that link it with lower-level observations of the domain models (Fig. 1). The instantiation of this pattern is used by the underlying reasoner to classify context instances, generated during the execution of the protocol, as complex activities.
¹ Xtion Pro - http://www.asus.com/Multimedia/Xtion_PRO/
² Plugwise sensors - https://www.plugwise.nl/
³ Wireless Sensor Tag System - http://wirelesstag.net/
⁴ Jawbone UP24 - https://jawbone.com
⁵ Withings Aura - http://www2.withings.com/us/en/products/aura
⁶ LODE - http://linkedevents.org/ontology/
⁷ DUL - http://www.loa.istc.cnr.it/ontologies/DUL.owl
⁸ OWL Time - http://www.w3.org/TR/owl-time/
The instantiation involves linking ADLs with context containment relations through class equivalence axioms. For example, given that the activity PrepareTea involves the observations TurnKettleOn, CupMoved, KettleMoved, TeaBagMoved and TurnKettleOff, its semantics are defined as:
PrepareTea ≡ Context ⊓ ∃contains.TurnKettleOn ⊓ ∃contains.CupMoved
             ⊓ ∃contains.KettleMoved ⊓ ∃contains.TeaBagMoved
             ⊓ ∃contains.TurnKettleOff
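The sketch below illustrates how such an equivalence axiom could be expressed and checked with an off-the-shelf DL reasoner, here using the owlready2 library; the library choice and code are an assumption for illustration, not Dem@Home's actual implementation, although the class and property names follow the axiom above.

```python
# Illustrative sketch, not Dem@Home's actual code: the PrepareTea equivalence
# axiom in owlready2, classified by a DL reasoner (sync_reasoner needs Java).
from owlready2 import Thing, ObjectProperty, get_ontology, sync_reasoner

onto = get_ontology("http://example.org/demhome.owl")  # hypothetical IRI

with onto:
    class Event(Thing): pass
    class Context(Event): pass
    class TurnKettleOn(Event): pass
    class CupMoved(Event): pass
    class KettleMoved(Event): pass
    class TeaBagMoved(Event): pass
    class TurnKettleOff(Event): pass

    class contains(ObjectProperty):
        domain = [Context]
        range = [Event]

    class PrepareTea(Context):
        equivalent_to = [Context
                         & contains.some(TurnKettleOn)
                         & contains.some(CupMoved)
                         & contains.some(KettleMoved)
                         & contains.some(TeaBagMoved)
                         & contains.some(TurnKettleOff)]

    # A context instance containing all five atomic observations.
    ctx = Context("ctx_morning")
    ctx.contains = [TurnKettleOn(), CupMoved(), KettleMoved(),
                    TeaBagMoved(), TurnKettleOff()]

sync_reasoner()                  # DL classification and realization
print(PrepareTea in ctx.is_a)    # the context instance is classified as PrepareTea
```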
According to clinical experts involved in the development of Dem@Home, highlighting problematic situations next to the entire set of monitored activities and metrics would further facilitate and accelerate clinical assessment. Dem@Home uses a set of predefined rules (expressed in SPARQL) with numerical thresholds that clinicians can adjust and personalize for each of the individuals in their care, through a GUI. Furthermore, each analysis is invoked for a period of time, allowing different thresholds for different intervals, e.g. before and after a clinical intervention. Problematic situations supported so far concern night sleep (short duration, many interruptions, too long to fall asleep), physical activity (low daily activity totals), missed activities (e.g. skipping daily lunch) and recurring problems (problems for consecutive days).
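As an illustration of such a rule, the sketch below expresses a short-sleep check as a SPARQL query with an adjustable threshold, executed here with rdflib; the dem: vocabulary, the property names and the 7-hour threshold are hypothetical assumptions, not the clinically tuned rules used in Dem@Home.

```python
# Illustrative sketch of a threshold-based problem detection rule in SPARQL,
# executed with rdflib; vocabulary and threshold are hypothetical assumptions.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

DEM = Namespace("http://example.org/demhome#")

g = Graph()
night = DEM.night_2015_07_12
g.add((night, RDF.type, DEM.NightSleep))
g.add((night, DEM.totalSleepHours, Literal(5.5, datatype=XSD.decimal)))

SHORT_SLEEP_RULE = """
PREFIX dem: <http://example.org/demhome#>
SELECT ?night ?hours
WHERE {
  ?night a dem:NightSleep ;
         dem:totalSleepHours ?hours .
  FILTER (?hours < ?threshold)   # clinician-adjustable, per individual and period
}
"""

# The threshold is bound at query time, so clinicians can personalize it.
threshold = Literal(7.0, datatype=XSD.decimal)
for row in g.query(SHORT_SLEEP_RULE, initBindings={"threshold": threshold}):
    print(f"Short sleep detected: {row.night} ({row.hours} h)")
```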
5 End-User Assessment Application
At the application level, Dem@Home provides a multitude of user interfaces to assist both clinical staff, summarizing an individual's performance and highlighting abnormal situations, and patients, offering a simplified view of measurements and educational material.
The clinician interface offers four different approaches to monitoring a patient, i.e. Summary, Comparison, Correlation and All Observations, as well as four options for the time extent of the data, i.e. One-Day, Per Day, Per Week and Per Month. In One-Day Summary, sleep measurements are obtained from a single night and are categorized as Total Time in Bed but Awake, Total Time Shallow Sleep, Total Time Deep Sleep, Total Time Asleep, Number of Interruptions and Sleep Latency (Fig. 2).
Fig. 2. The clinician interface regarding sleep parameters, in the One-Day Summary view.
In Summary Per Day, the clinician is able to select a time interval between two dates or a single date, to observe sleep stages, physical activity levels and other activities of daily living, derived from power consumption, moved objects and presence in rooms (Fig. 3). Moreover, the clinician can set specific thresholds for sleeping problems during the night, and a problems section is then added (bottom of Fig. 3). In Comparison Per Day, different measurements of a particular time period can be combined in the same chart, allowing the clinician to check how observations affect each other, e.g. how physical activity affects sleep or how usage of a device affects a daily activity (Fig. 4). Finally, Correlation shows a scatterplot for two types of measurements, while All Observations shows all collected data in detail. The Per Week/Month options offer the above-mentioned functionalities summarized per week/month.
Fig. 3. Sleep, daily activities and problems in the Summary Per Day view.
On the other hand, patients are introduced to an alternative interface, tailored to provide easy monitoring of their daily life and simple interaction with the clinicians. Accessed through a tablet device, the interface displays a limited view of the most important measurements, to avoid overwhelming the users or even stressing them out. The patient interface presents 3-day information regarding Physical Activity (daily steps and burned calories), Sleep, Usage of Appliances and Medication. Especially in the Sleep section, the patient is notified about how many sleep interruptions they had during the night. In addition to sensor readings, the patient interface is enhanced with educational material, such as recipes or instructions to guide them step by step through routine tasks, and with the ability to exchange messages between end-users and clinicians. Overall, the application is explicitly designed to help patients feel confident and secure with the system they are using, but also to encourage social interaction between users and clinicians.
6 Evaluation
Dem@Home was evaluated in four home installations, in the residences of individuals living alone who had been clinically diagnosed with mild cognitive impairment or mild dementia; each installation was maintained for four months. The sensors and the relevant home areas or devices of each installation (Table 1) were selected after a visit by the clinician to the participants. The majority of deployed sensors covered the kitchen, bathroom and bedroom, since these rooms are strongly linked with most daily activities.
Since the framework embodies an interdisciplinary approach, it was evaluated from both a research and a clinical perspective. Firstly, we evaluated the effectiveness of activity recognition through the fusion of sensor data and existing multimedia analytics. Secondly, we report clinical results, which vary across participants but add significant value to monitoring and interventions.
For the evaluation of the ontology-based fusion and activity recognition capabilities of Dem@Home, ground truth was obtained through annotation (performed once), based on images from the ambient cameras. We use the True Positive Rate (TPR) and Positive Predictive Value (PPV) measures, which denote recall and precision respectively, to evaluate the performance with respect to ADLs recognized as performed. The clinical expert suggested the monitoring of five activities, namely drug box preparation, cooking, making tea, watching TV and bathroom visits. Table 2 depicts the pertinent context dependency models defined.
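For reference, these standard measures are computed per activity as

TPR = TP / (TP + FN),   PPV = TP / (TP + FP),

where TP, FP and FN denote the true positives, false positives and false negatives of the recognized activity instances against the annotated ground truth.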
Fig. 4. Comparison Per Day chart between two activities.
Dem@Home's ADL recognition performance was evaluated on a dataset of 31 days, collected in July 2015. As observed in Table 3, the more atomic and continuous an activity is, the more accurate its detection. BathroomVisit, the most accurately detected activity, is never interleaved with other activities. On the contrary, cooking is a long-lasting activity, interrupted by instances of other events (e.g. watching TV) and influenced by uncertainty and the openness of the environment. WatchTV and PrepareTea are fairly short in duration, causing less uncertainty and fewer interleaved events in between, and yielding decent precision and recall rates.
On the other hand, the clinical evaluation of the framework regards its capabilities and the fulfillment of clinical requirements. With Dem@Home supporting clinical interventions, significant improvement was found in the post-pilot clinical assessment in multiple domains, such as improved physical condition and sleep quality, ultimately bringing about positive change in mood and cognitive state, measured objectively by neuropsychological tests. In detail, the first participant overcame insomnia, lack of exercise and the neglect of daily chores. The second participant showed improvement in sleep and mood, while the other two users benefited with respect to sleep and medication.
Table 1. Sensors in home installation

Sensor         Home area or device
Camera         Kitchen, Living room, Hall
Plugs          TV, Iron, Vacuum, Cooking device, Boiler, Kettle, Bathroom lights
Tags           TV remote, Iron, Fridge door, Drug cabinet, Drug box, Tea bag, Cup
Presence       Kitchen, Bathroom, Living room
Wristwatch     User's arm
Sleep sensor   Bed
Table 2. Context dependency models for the evaluation

Activity Concept   Context dependency set
PrepareDrugBox     DrugBoxMoved, DrugCabinetMoved, KitchenPresence
Cooking            TurnCookerOn, KitchenPresence
PrepareTea         TurnKettleOn, TeaBagMoved, CupMoved, KitchenPresence, TurnKettleOff
WatchTV            TurnTvOn, RemoteControlMoved, LivingRoomPresence
BathroomVisit      BathroomPresence, TurnBathroomLightsOn
Table 3. Precision and recall for activity recognition

Activity         Recall (TPR)   Precision (PPV)
PrepareDrugBox   0.86           0.89
Cooking          0.61           0.68
PrepareTea       0.81           0.86
WatchTV          0.87           0.80
BathroomVisit    0.91           0.94
7 Conclusion and Future Work
Dem@Home is an ambient assisted living framework integrating a variety of sensors, analytics and semantic interpretation, with a special focus on ambient care for dementia. New, affordable sensors have been integrated seamlessly into the framework, along with a set of processing components ranging from sensor to image analytics. All knowledge is semantically interpreted for further fusion and detection of problematic behaviours, while tailored user interfaces support detailed monitoring and adaptive interventions. Evaluation of the framework has yielded valuable and promising results with respect to accurate fusion and activity detection, as well as clinical value in care.
Regarding future directions, Dem@Home could be extended for increased portability and ease of installation. Specifically, establishing an open-source, IoT-enabled semantic platform, following the latest advances in single-board computing, would allow the platform to be easily deployed in multiple locations. Combined with the ability to push events to a cloud infrastructure, the framework could constitute a powerful platform for telemedicine and mobile health, combining sensors and sophisticated ambient intelligence techniques such as computer vision.
8 Acknowledgement
This work has been supported by the H2020-ICT-645012 project KRISTINA: A Knowledge-Based Information Agent with Social Competence and Human Interaction Capabilities.
9 References
1. Prince, M., Wimo, A., Guerchet, M., Ali, G., Wu, Y.T., Prina, M.: World Alzheimer Report 2015. The global impact of dementia: an analysis of prevalence, incidence, cost and trends. Alzheimer's Disease International, London (2015).
2. Stavropoulos, T.G., Meditskos, G., Kontopoulos, E., Kompatsiaris, I.: The DemaWare Service-Oriented AAL Platform for People with Dementia. Artif. Intell. Assist. Med. (AI-AM/NetMed 2014). 11 (2014).
3. Kerssens, C., Kumar, R., Adams, A.E., Knott, C.C., Matalenas, L., Sanford, J.A.,
Rogers, W.A.: Personalized technology to support older adults with and without
cognitive impairment living at home. Am. J. Alzheimers Dis. Other Demen.
1533317514568338 (2015).
4. Dawadi, P.N., Cook, D.J., Schmitter-Edgecombe, M., Parsey, C.: Automated
assessment of cognitive health using smart home technologies. Technol. Health
Care Off. J. Eur. Soc. Eng. Med. 21, 323 (2013).
5. Okeyo, G., Chen, L., Wang, H., Sterritt, R.: Dynamic sensor data segmentation
for real-time knowledge-driven activity recognition. Pervasive Mob. Comput. 10,
155–172 (2014).
6. Riboni, D., Pareschi, L., Radaelli, L., Bettini, C.: Is ontology-based activity
recognition really effective? In: Pervasive Computing and Communications
Workshops. pp. 427–431. IEEE (2011).
7. Patkos, T., Chrysakis, I., Bikakis, A., Plexousakis, D., Antoniou, G.: A reasoning
framework for ambient intelligence. In: Artificial Intelligence: Theories, Models
and Applications. pp. 213–222. Springer (2010).
8. Suzuki, R., Otake, S., Izutsu, T., Yoshida, M., Iwaya, T.: Monitoring daily living
activities of elderly people in a nursing home using an infrared motion-detection
system. Telemed. J. E Health. 12, 146–155 (2006).
9. Chang, Y.-J., Chen, C.-H., Lin, L.-F., Han, R.-P., Huang, W.-T., Lee, G.-C.:
Wireless sensor networks for vital signs monitoring: Application in a nursing
home. Int. J. Distrib. Sens. Netw. 2012, (2012).
10. Helal, S., Mann, W., King, J., Kaddoura, Y., Jansen, E., others: The gator tech
smart house: A programmable pervasive space. Computer. 38, 50–60 (2005).
11. Demongeot, J., Virone, G., Duchêne, F., Benchetrit, G., Hervé, T., Noury, N.,
Rialle, V.: Multi-sensors acquisition, data fusion, knowledge mining and alarm
triggering in health smart homes for elderly people. C. R. Biol. 325, 673–682
(2002).
12. Tamura, T., Togawa, T., Ogawa, M., Yoda, M.: Fully automated health monitoring system in the home. Med. Eng. Phys. 20, 573–579 (1998).
13. Bonner, S.G.: Assisted interactive dwelling house. In: Proc. 3rd TIDE Congress:
Technology for Inclusive Design and Equality Improving the Quality of Life for
the European Citizen. p. 25 (1998).
14. Avgerinakis, K., Briassouli, A., Kompatsiaris, I.: Recognition of Activities of
Daily Living for Smart Home Environments. In: Intelligent Environments (IE),
2013 9th International Conference on. pp. 173–180. IEEE (2013).