Multimodal Sensing and Intelligent Fusion for Remote
Dementia Care and Support
Thanos G. Stavropoulos, Georgios Meditskos, Alexia Briassouli, Ioannis Kompatsiaris
Centre for Research & Technology Hellas
Information Technologies Institute
6th Km Charilaou-Thermi
57001 Thessaloniki, Greece
{athstavr, gmeditsk, abria, ikom}@iti.gr
ABSTRACT
A comprehensive multi-sensor monitoring and feedback sys-
tem is presented, designed to support independent living
for elderly people with dementia or other conditions, and
provide decision support for their formal and informal care-
givers. This solution goes significantly beyond existing mon-
itoring and assisted living approaches, which operate with
a simple set of sensors and measurements, as it integrates a
very heterogeneous set of sensing modalities and technolo-
gies, including video, audio analysis, in addition to physi-
ological, environmental and other measurements. Semantic
web technologies are used for the intelligent integration
and feedback of the sensor analysis results, in line with user
requirements dictated by clinicians. This results in rele-
vant feedback and decision support, which is communicated
to the end users via appropriately designed user interfaces.
A variety of clinical scenarios and environments are sup-
ported, from short-duration testing in hospital environments
to long-term daily life monitoring and support at home, for
independent living.
Categories and Subject Descriptors
[Human-centered Computing]: Ubiquitous and Mobile
Computing - Ambient Intelligence; [Applied Computing]:
Healthcare Information Systems; [Computing Method-
ologies]: Knowledge Representation and Reasoning
Keywords
ambient assisted living; sensors; semantic web; ontologies;
reasoning; context-awareness; dementia;
1. INTRODUCTION
The ageing population of the world is leading to an in-
crease in chronic conditions associated with old age, such
as dementia. Their social, economic and personal burden
is enormous, as they can prove debilitating for individuals,
who can no longer live independently. Assisted living so-
lutions are being designed to help, providing discreet but
relevant monitoring and decision support to elderly people
living alone and their caregivers, who can remotely obtain a
picture of the person’s condition, so as to provide personal-
ized, effective care.
This work presents the Dem@Care system, a comprehen-
sive, multi-sensor monitoring solution for individuals with
dementia, deploying a great variety of sensors to monitor
the environment, their physiological status, overall health
and lifestyle aspects, including sleep, levels of activity, socia-
bility and mood. The heterogeneous sensor data is integrated in
an intelligent manner, resulting in a comprehensive picture
of the person’s current status and its evolution over time,
allowing caregivers to determine the best care approach in
each case. Dem@Care utilizes the DemaWare2 framework
for interoperability in an AAL environment, described in de-
tail in [11]. This system has been tested with success in real
world conditions in Ireland, France, Sweden and Greece, in
hospital and home environments, addressing the user needs
and ethical requirements of each setup. In the future this
solution can be expanded with State of the Art (SoA) sen-
sors and analysis, for timely, accurate outcomes leading to
relevant feedback and optimal care.
Each sensing modality is analyzed separately, and their
results are integrated in a semantically meaningful manner,
in line with user requirements dictated by healthcare pro-
fessionals, the informal caregivers, as well as the patients
themselves. The outcomes can range from reminders (e.g.
if a person has a doctor’s appointment) to more sophisti-
cated advice on lifestyle changes, suggested based on the
multi-sensor measurements, medical expertise and that spe-
cific person's profile. The technologies used range from smart-
home monitoring sensors, such as temperature, humidity and motion
sensors, to wearables for monitoring the person's physiolog-
ical status. Additionally, new modalities are deployed - and
integrated for the first time - in such a comprehensive man-
ner. Namely, audio recordings of speech are analyzed to
detect early signs of potentially pathological changes, video
monitoring gives a clear picture of lifestyle and changes in
it, particularly concerning the ability to carry out activities
of daily living, and sleep is monitored in a passive, unob-
trusive manner, as it is a rich source of information for an
individual’s health status.
2. RELATED WORK
Recent years have seen a great upsurge in the develop-
ment of Ambient Intelligence and Ambient Assisted Living
(AAL) frameworks, which have already been deployed in
smart homes for health and lifestyle monitoring, safety and
other applications. The sensor modalities chosen are tai-
lored to the end user needs in each case, with the initial
solutions on the market today1 mostly focusing on envi-
ronmental and security-related sensing (light, temperature,
humidity, open/close doors/windows, etc.) [9]. Additionally,
most existing solutions monitor only one domain, using a
single or a few devices. Some applications are geared to-
wards wandering behavior prevention with geolocation de-
vices, or the assessment of physical activity levels, sleep and daily
activities [8], [4]. A sensor network deployed in real-world
nursing homes in Taiwan [3] is used to continuously moni-
tor vital signs of the inhabitants. However, that system has
limited interoperability, as it cannot fuse additional modali-
ties like environmental or sleep sensors. A platform [10] has
been developed in France for remote monitoring and assis-
tance of the elderly; however, multiple sensor results are simply
combined, rather than integrated in an intelligent manner
that takes into account expert knowledge (from clinicians),
context, and each end user’s particular needs. Other solu-
tions, like Ubisense2, only examine personal activity levels
and other physiological measurements, but disregard the en-
vironment and context in general.
Numerous research efforts are also taking place to involve
robots in assisted living environments3; however, they are of-
ten costly and impractical, so they still need time to become
widely accepted and adopted on a large scale. Robots will
indeed form an integral part of assisted living solutions in
the future [5], [6], [7] but are beyond the scope of this work,
in which we present a passive monitoring and tracking so-
lution for effective and relevant personalized feedback and
decision support4.
The Dem@Care solution offers a unified framework with
knowledge representation for sensor interoperability that mea-
sures multiple aspects of health, context and lifestyle to au-
tomatically assess disturbances and their causes, aiding clin-
ical monitoring and interventions for the provision of person-
alized care recommendations.
3. THE DEM@CARE FRAMEWORK
The central goal of Dem@Care is to help individuals with
dementia live independently alone for as long as possible,
by monitoring aspects of their life considered to be most
important by clinical experts (in this case sleeping, eating,
sociability, levels of activity, mood). The sensors used are
unobtrusive and can provide useful data on these five areas
after appropriate analysis and intelligent fusion, as many of
them provide indirect information on the aspects of a per-
son’s daily life. It should also be noted that the system is
modular, as some of the sensing modalities cannot be used
in all environments or countries, depending on the corre-
sponding ethical regulations.
1Home sensing solutions include: https://www.smartthings.com/,
http://www.awarehome.gatech.edu/
2http://www.doc.ic.ac.uk/vip/ubisense
3Robotic AAL: http://www.radio-project.eu/,
http://accompanyproject.eu/, http://giraffplus.eu/
4The Dem@Care FP7 project - www.demcare.eu
3.1 Dem@Care sensors
The sensing modalities deployed aim to monitor the en-
vironment (temperature, motion/presence, electricity con-
sumption etc), the health status of the individual (via wear-
able fitness trackers, sleep monitors, recognition of daily ac-
tivities from video, speech analysis), either by ambient sen-
sors placed in the house, or wearables. The sensors used are
proprietary low-cost devices (Fig. 1), and include ambient and
object motion sensors, a wearable fitness tracker, a sleep monitor,
audio and video sensors, and smart plugs. A subset of these
sensors is used in each deployment, depending on the envi-
ronment (e.g. in a home or in a hospital), each individual’s
needs and national and international ethical/legal regula-
tions.
Ambient motion sensors monitor entrance/exit from rooms,
giving a picture of a person’s levels of activity, presence in
certain rooms, the presence of others (to detect levels of so-
ciability), and also providing added security, since they can
indicate the presence of an intruder. Object motion sensors,
known as tags, are also used to detect the movements of ob-
jects (e.g. a coffee maker) and determine whether they are being used.
Smart plugs track the usage of devices related to daily ac-
tivities (e.g. refrigerator, TV, coffee maker), giving a clear
picture of an individual’s habits and lifestyle over time.
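To illustrate how appliance usage can be derived from smart-plug readings, the following sketch thresholds power consumption into usage intervals; the data format and the 5 W threshold are assumptions made for illustration, not the system's actual processing.
```python
# Illustrative sketch: deriving appliance-usage intervals from smart-plug power
# readings. The sampling format and the 5 W threshold are assumptions.

from datetime import datetime, timedelta

def usage_intervals(samples, threshold_watts=5.0):
    """samples: list of (timestamp, watts); returns (start, end) intervals
    during which the plugged-in appliance appears to be in use."""
    intervals, start = [], None
    for ts, watts in samples:
        if watts >= threshold_watts and start is None:
            start = ts                      # appliance switched on
        elif watts < threshold_watts and start is not None:
            intervals.append((start, ts))   # appliance switched off
            start = None
    if start is not None:                   # still running at the end of the trace
        intervals.append((start, samples[-1][0]))
    return intervals

if __name__ == "__main__":
    t0 = datetime(2016, 10, 16, 8, 0)
    trace = [(t0 + timedelta(minutes=i), w)
             for i, w in enumerate([1, 2, 900, 950, 920, 3, 1])]  # e.g. a kettle
    print(usage_intervals(trace))
```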
Health status can be inferred from wearable fitness track-
ers’ and ambient sleep trackers’ measurements, while changes
in their outputs over a long period of time can serve as valu-
able early signs of improvement or worsening of a person’s
condition. In clinical interview-style setups, wearable micro-
phones are used to detect potential early signs of dementia or
deterioration from the analysis of a person’s speech. Video
obtained from ambient color or color-depth cameras is used
to monitor a person's lifestyle, with a focus on the detection
of activities of daily living (ADLs) necessary for independent
living, as dictated by clinical experts. Cameras also serve
security purposes, to prevent or record criminal activity in
the house, such as a robbery, or even help prevent abuse
that could take place in a nursing home.
3.2 Sensor framework and data analysis
The multimodal sensor framework of Dem@Care is re-
alized using the architecture depicted in Fig. 1, with the
following layers: (1) sensors layer, comprising all sensors
and their measurements, (2) analysis layer, for the analysis
of the sensor measurements, (3) representation layer, for a
uniform semantic representation, (4) interpretation layer,
for semantic interpretation and fusion, (5) service layer,
corresponding to the web service interface and (6) applica-
tion layer, with domain-specific end-user applications.
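To make the layering concrete, the following minimal sketch mirrors the flow from raw measurements through analysis and semantic representation to interpretation; the class and property names are hypothetical and do not reproduce the DemaWare2 API.
```python
# Hypothetical, minimal sketch of the layered flow described above; class and
# property names are illustrative, not the DemaWare2 API.

from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class Measurement:               # (1) sensors layer: a raw reading
    sensor_id: str
    value: float
    timestamp: datetime

@dataclass
class Observation:               # (3) representation layer: uniform semantic form
    subject: str                 # e.g. "kitchen_plug_1"
    predicate: str               # e.g. "indicatesUsageOf"
    obj: str                     # e.g. "CoffeeMaker"
    timestamp: datetime

def analyse(m: Measurement) -> Optional[Observation]:
    """(2) analysis layer: turn a raw reading into a meaningful observation."""
    if m.sensor_id.startswith("plug") and m.value > 5.0:
        return Observation(m.sensor_id, "indicatesUsageOf", "CoffeeMaker", m.timestamp)
    return None

def interpret(observations: List[Observation]) -> List[str]:
    """(4) interpretation layer: fuse observations into higher-level activities."""
    if any(o.obj == "CoffeeMaker" for o in observations):
        return ["PrepareHotBeverage"]
    return []

if __name__ == "__main__":
    raw = [Measurement("plug_kitchen_1", 850.0, datetime(2016, 10, 16, 8, 5))]
    obs = [o for o in (analyse(m) for m in raw) if o is not None]
    print(interpret(obs))        # (5)/(6): results exposed to services and applications
```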
The measurement data undergoes different types of anal-
ysis, depending on the sensors. Data from simpler sensors, e.g.
for physiological monitoring, is processed and interpreted ac-
cording to open data formats or using proprietary libraries,
which results in a directly interpretable outcome. Longer
term monitoring outcomes, e.g. from smartplugs, can be
aggregated over time to detect consumption and lifestyle
patterns. Other sensor data, namely audio and video, re-
quires the development and use of sophisticated statistical
and machine learning algorithms for their analysis and the
extraction of higher level information.
Figure 1: Dem@Care framework: sensors, analysis
modules, applications
The ability to live independently alone depends on a person's
ability to carry out activities of daily living (ADLs), which are
dictated by clinicians. These are detected in videos using State of
the Art (SoA) computer vision algorithms [1], [2], resulting in
indicators of difficulties in daily life, related to dementia. In
particular, people with cognitive difficulties carry out ADLs
more slowly, or after many repetitions: this information can
only be obtained from video, where specific ADLs can be
detected. It should be noted that video is only deployed in
environments where it is allowed, following the current eth-
ical and legal regulations in each country, and always after
informed consent is obtained. Testing takes place in differ-
ent environments, ranging from constrained hospital lab en-
vironments, where individuals are asked to carry out specific
ADLs (Fig. 2), to home monitoring. Most importantly, to
ensure privacy, raw video data is not stored or transmitted,
but is either analyzed online, or in each location of deploy-
ment. Furthermore, only non-identifying encrypted meta-
data is transmitted over secure networks. Finally, in some
setups (mostly in hospital environments) audio recordings of
the individual performing specific speech exercises were ana-
lyzed in the spectral domain. Changes in a person’s speech,
or even certain characteristics (e.g. speaking more slowly,
with many pauses, slurring etc) were shown to be correlated
to their cognitive status and its evolution, providing insights
about early indicators of dementia that cannot be obtained
from other modalities.
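As a simplified illustration of one of the speech cues mentioned above (pausing behaviour), the sketch below computes a pause ratio from a short-time energy envelope; this is an illustrative simplification under assumed parameters, not the project's actual spectral-domain audio analysis.
```python
# Simplified illustration of one speech cue mentioned above (pause behaviour):
# a short-time energy envelope with a silence threshold. Frame length and
# threshold are assumptions; the project's pipeline works in the spectral domain.

import numpy as np

def pause_ratio(signal, sample_rate, frame_ms=25.0, silence_db=-35.0):
    """Fraction of frames whose energy falls below a silence threshold."""
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    energy = np.mean(frames ** 2, axis=1) + 1e-12           # avoid log(0)
    energy_db = 10 * np.log10(energy / np.max(energy))      # level relative to loudest frame
    return float(np.mean(energy_db < silence_db))

if __name__ == "__main__":
    sr = 16000
    t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
    speech = 0.5 * np.sin(2 * np.pi * 220 * t)   # synthetic stand-in for voiced speech
    speech[sr // 2: sr] = 0.0                    # insert a half-second pause
    print(f"pause ratio: {pause_ratio(speech, sr):.2f}")
```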
3.3 Semantic Fusion and Integration
Intelligent fusion of the sensor analysis outcomes takes
place by first developing a Knowledge Base (KB) that mod-
els domain knowledge. The domain knowledge in this case
concerns clinical aspects, as well as aspects of the person’s
daily life and profile, such as demographics and clinical his-
tory. Measurements and activities are also represented, in-
cluding their location, objects and temporal context. In
real-world applications, activities can appear as a complex
combination of measurement data. This is handled by the
design of activity ontologies to represent relations between
sensor outputs, context, activities, and their classification.
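A minimal sketch of how such knowledge might be expressed as triples, here using Python with rdflib; the namespace, class and property names are illustrative assumptions and do not reproduce the actual Dem@Care activity ontologies.
```python
# Minimal sketch of representing an observation and profile knowledge as RDF
# triples with rdflib. Names are illustrative, not the Dem@Care ontologies.

from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/demcare-sketch#")

g = Graph()
g.bind("ex", EX)

# An atomic observation: the coffee maker's smart plug reports usage.
g.add((EX.obs1, RDF.type, EX.ApplianceUsageObservation))
g.add((EX.obs1, EX.involvesObject, EX.CoffeeMaker))
g.add((EX.obs1, EX.detectedIn, EX.KitchenZone))
g.add((EX.obs1, EX.startTime,
       Literal("2016-10-16T08:05:00", datatype=XSD.dateTime)))

# Profile knowledge that the fusion step can combine with observations.
g.add((EX.person1, RDF.type, EX.PersonWithDementia))
g.add((EX.person1, EX.livesIn, EX.Home1))

print(g.serialize(format="turtle"))
```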
Figure 2: Activity Zones
High-level activity recognition is realized in the Semantic
Interpretation (SI) layer, where the intelligent fusion of sim-
pler activities takes place, leading to a more accurate profile
of the person being monitored. SI implements a location-
driven context generation and classification approach. Each
room is divided into zones, according to the location of each
activity, so when a participant enters a zone, the SI gen-
erates a container instance and starts associating it with
collected observations related to that zone. These container
instances actually capture contextual information and are
used to recognize ongoing activities. Temporal aspects of the
measurements and activities being detected are also taken
into account, as they can help further discriminate between
different ADLs. Within Dem@Care, the temporal intervals
of the activities taking place are extracted from the sensor
data, thus providing temporal context.
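The zone-driven context containers described above can be sketched as follows; this is a simplified, hypothetical rendering, whereas the actual SI layer realizes the mechanism with ontologies and reasoning rather than plain objects.
```python
# Simplified, hypothetical sketch of zone-driven context containers; the real
# Semantic Interpretation layer uses ontologies and reasoning instead.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ContextContainer:
    zone: str
    opened: datetime
    observations: List[str] = field(default_factory=list)

class ZoneTracker:
    def __init__(self):
        self.active: Optional[ContextContainer] = None
        self.closed: List[ContextContainer] = []

    def enter_zone(self, zone: str, when: datetime):
        """Entering a new zone opens a container; the previous one is closed."""
        if self.active and self.active.zone != zone:
            self.closed.append(self.active)
        if not self.active or self.active.zone != zone:
            self.active = ContextContainer(zone, when)

    def observe(self, observation: str):
        """Associate a sensor observation with the currently active zone context."""
        if self.active:
            self.active.observations.append(observation)

if __name__ == "__main__":
    tracker = ZoneTracker()
    tracker.enter_zone("TeaPreparationZone", datetime(2016, 10, 16, 9, 0))
    tracker.observe("KettlePlugOn")
    tracker.observe("TeaBoxMoved")
    tracker.enter_zone("PhoneZone", datetime(2016, 10, 16, 9, 6))
    print(tracker.closed[0].observations)   # observations fused per zone context
```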
4. DEPLOYMENT AND EVALUATION
The proposed health monitoring and feedback framework
is deployed in several different environments to demonstrate
its capabilities and assess its usability in real-world appli-
cations. The modular nature of the system allows different
subsets of its components to be deployed in each setup, de-
pending on user needs, trial protocols and so on.
4.1 Lab Deployment
Lab-type deployments usually involve a protocol of ADLs
to be carried out by the individual over a short time period
(about 20 minutes), monitored by the proposed system. The
results are directly communicated to the end users, i.e. doc-
tors and nurses, giving them a richer and more objective picture
of the individual. These tests allow for the use of a larger
number of sensors, since their duration is short and they do
not concern a person’s private life. The setup in Fig. 3 con-
tains object motion sensors, to measure object usage, and a
smartplug for the teakettle. Ambient video cameras record
the entire scene, so as to monitor the activities carried out
by the subject and detect how they take place.
The lab implementation results are communicated to end
users by a graphical user interface, which has been designed
to (1) assess the individual’s health status and ability to
carry out daily activities, (2) view the outcomes of the sen-
sor measurements and analysis in an understandable man-
ner.
Figure 3: Sensor setup and deployment in the lab
Figure 4: Sensor initialization screen for lab setup
The assessment screen can be seen in Fig. 4, where an
instantiation before the testing is shown: in this case, the
users can check the status and activate/deactivate sensors,
according to the current protocol step (upper part) while
monitoring sensor events and connectivity in real-time (bot-
tom part).
A timeline of both complete and incomplete activities,
highlighted in green and red respectively, is also shown,
along with further details for the activities, such as their rel-
ative order, total duration and number of repetitions, which
can all provide valuable insights to clinicians.
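The per-activity details mentioned here (relative order, total duration, repetitions, completion) could be computed from detected activity intervals along the following lines; the input format is an assumption for illustration, not the system's internal representation.
```python
# Illustrative computation of per-activity details (relative order, total
# duration, repetitions, completion) from detected activity intervals.
# The input format is an assumption, not the system's internal representation.

from collections import OrderedDict
from datetime import datetime, timedelta

def summarise(detections):
    """detections: list of (activity, start, end, complete) tuples."""
    summary = OrderedDict()
    for order, (activity, start, end, complete) in enumerate(
            sorted(detections, key=lambda d: d[1]), start=1):
        entry = summary.setdefault(activity, {
            "first_order": order, "total_duration": timedelta(),
            "repetitions": 0, "completed": False})
        entry["total_duration"] += end - start   # accumulate time spent on this ADL
        entry["repetitions"] += 1                # count attempts/repetitions
        entry["completed"] |= complete           # completed on at least one attempt
    return summary

if __name__ == "__main__":
    t = datetime(2016, 10, 16, 10, 0)
    detections = [
        ("PrepareTea", t, t + timedelta(minutes=4), True),
        ("MakePhoneCall", t + timedelta(minutes=5), t + timedelta(minutes=7), False),
        ("MakePhoneCall", t + timedelta(minutes=8), t + timedelta(minutes=10), True),
    ]
    for activity, details in summarise(detections).items():
        print(activity, details)
```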
4.1.1 Results from Lab Clinical Trials
In a hospital environment deployment with 98 partici-
pants (27 Alzheimer’s Disease - AD, 38 Mild Cognitive Im-
pairment - MCI, 33 Healthy), aged 60-90, it was shown that
the system can successfully conduct lab trials autonomously,
allowing psychologists and other caregivers (nurses, neurol-
ogists and other clinicians) to examine a larger number of
individuals in an objective and reliable manner. Their feed-
back on the usability of the interfaces and the system in
general was very positive and participation in the pilots was
high.
Initial assessments of individuals and new insights on their
conditions have already been obtained from these first trials,
as a statistically significant difference was found in the du-
ration and the number of attempts carried out for ADLs be-
tween people with dementia and healthy individuals. MCI
participants completed the phone call and the account pay-
ment activities, while individuals with AD could not, allow-
ing the system to distinguish them with a mean accuracy of
73.67%.
Figure 5: Lab results screen after testing
Similarly, individuals with MCI were able to pay a bank bill
and make a phone call, allowing the system to distinguish
between them with 84% accuracy.
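The paper does not detail how these accuracies were obtained; purely as an illustrative sketch, with synthetic data and a generic classifier that are not the study's method, group discrimination from ADL performance features could be quantified along these lines.
```python
# Purely illustrative: distinguishing participant groups from ADL performance
# features with a generic classifier. Data are synthetic and the classifier is
# an assumption; the paper does not specify the method used for its accuracies.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical features per participant:
# [phone_call_completed, payment_completed, total_ADL_duration_minutes]
X = np.array([
    [1, 1, 14.0], [1, 1, 16.5], [1, 0, 18.0], [1, 1, 15.2],   # e.g. MCI-like profiles
    [0, 0, 24.0], [0, 1, 26.5], [0, 0, 22.3], [0, 0, 27.1],   # e.g. AD-like profiles
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # 0 = MCI, 1 = AD (synthetic labels)

clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=4)   # simple cross-validated accuracy
print("mean accuracy:", scores.mean())
```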
4.2 Home Deployment
Home deployments differ from lab tests as they involve
continuous testing, aiming to support the individuals in their
daily life, while ethical regulations limit the sensors that can
be used in each country. The monitoring results are fused in
an intelligent manner, as before, and communicated to the
end users to support diagnosis, evaluation, care options and
provide personalized feedback to the patient. Fig. 6 pro-
vides a comprehensive overview of the Dem@Care system
deployed in homes, called Dem@Home. It shows the min-
imum set of devices used in homes, after confirming that
a) they fulfill acceptability requirements for the home resi-
dents and b) the quality and quantity of symptom-related
information they provide fulfills the clinical monitoring re-
quirements set by psychologists. In detail, all devices
are ambient, except a light wearable wristband, and are in-
stalled transparently in the home user’s environment. The
wristband is barely noticeable, and is found to provide a
feeling of safety and inclusion while being monitored. In
this manner, the Dem@Care setup provides an unobtrusive,
comprehensive monitoring solution that results in an accu-
rate and in-depth picture of the person's lifestyle and health
status.
4.2.1 Results from Home Pilots
Pilots also took place at home (Fig. 7), with four users
with MCI and mild dementia, who indeed benefited from
the deployment of the system. The system interface allowed
the identification of difficulties in daily living, such as many
sleep interruptions, neglect of daily tasks like cleaning and
decreased levels of activity.
The caregivers provided the appropriate feedback for each
issue and continued monitoring their progress in that do-
main. They encouraged the patients to walk more and
monitored them through their wrist-worn fitness tracker.
To monitor daily chores, daily activity recognition modules
were deployed via smartplugs, object motion trackers on de-
vices (vacuum cleaner, iron etc), and via computer vision
based detection of ADLs [2]. ADL recognition allowed the
clinician to monitor the individuals’ daily activities through-
out the entire day, and determine that they were adhering to
the suggested interventions and also showing some improvements.
Figure 6: Dem@Home architecture and information
flow in home deployments
Figure 7: Dem@Home sensor setup
Overall, after monitoring four users for a period of at
least two months each, the system monitoring outcomes and
weekly interviews confirmed that the participants showed an
increase in physical activity, improved sleep and mood.
5. CONCLUSIONS
A comprehensive solution for multimodal monitoring, in-
telligent fusion and personalized decision support for re-
mote care, focusing on the case of dementia, has been pre-
sented. Numerous heterogeneous sensing modalities have
been seamlessly integrated, combining ambient, wearable
and lifestyle sensors. A set of processing components rang-
ing from sensor analytics for event detection to sophisticated
image, video and audio analytics, was integrated and ana-
lyzed. All knowledge is uniformly stored in a knowledge
base, enabling its semantic interpretation for further fusion,
aggregation and detection of problematic behaviours. The
framework was designed for monitoring people with demen-
tia after consulting clinical experts on user requirements,
monitoring and feedback. Real world testing in lab and
home environments led to improvements in the health sta-
tus of individuals living at home and accurate assessments
in the lab, thanks to the reliable monitoring and tailored
feedback.
Future directions from this work include the extension of
both the framework and its clinical applications. The sys-
tem can be extended to a complete solution for real-time
feedback, while mood and stress can also be assessed via the
latest wearable sensors, especially in home settings. While
the framework has been deployed in numerous locations, it
can be further extended for increased portability and installa-
bility. Combined with infrastructure to push events to the
cloud, the framework could constitute a powerful platform
for telemedicine and mobile health, combining sensors and
sophisticated ambient intelligence techniques
such as computer vision.
6. ACKNOWLEDGMENTS
This work was funded by the European Commission under
the 7th Framework Program (FP7 2007-2013), grant agree-
ment 288199 Dem@Care.
7. REFERENCES
[1] K. Avgerinakis, A. Briassouli, and I. Kompatsiaris.
Activity detection using sequential statistical
boundary detection (SSBD). Computer Vision and
Image Understanding (CVIU), 144:46.
[2] K. Avgerinakis, A. Briassouli, and I. Kompatsiaris.
Activity detection and recognition of daily living
events. In 1st ACM MM Workshop on Multimedia
Indexing and Information Retrieval for Healthcare
(MIIRH), 2013.
[3] Y.-J. Chang, C.-H. Chen, L.-F. Lin, R.-P. Han,
W.-T. Huang, and G.-C. Lee. Wireless sensor
networks for vital signs monitoring: Application in a
nursing home. Int. J. Distrib. Sens. Networks,
21(4):323.
[4] P. Dawadi, D. Cook, M. Schmitter-Edgecombe, and
C. Parsey. Automated assessment of cognitive health
using smart home technologies. Journal of Technology
and Health Care, 21(4):323.
[5] F. S. et al. A holistic approach for advancing robots in
ambient assisted living environments. In Embedded
and Ubiquitous Computing (EUC), 2015 IEEE 13th
International Conference on, pages 140–147, 2015.
[6] G. K. et al. Computation and communication
challenges to deploy robots in assisted living
environments. In 2016 Design, Automation and Test
in Europe Conference and Exhibition (DATE), pages
888–893, 2016.
[7] J. P. et al. Results of a real world trial with a mobile
social service robot for older adults. In 2016 11th
ACM/IEEE International Conference on
Human-Robot Interaction (HRI), pages 497–498, 2016.
[8] C. Kerssens, K. R., and A. Adams. Personalized
technology to support older adults with and without
cognitive impairment living at home. Am J Alzheimers
Dis Other Demen., page 85.
[9] M. Morris, B. Adair, K. Miller, E. Ozanne, R. Hansen,
A. J. Pearce, N. Santamaria, L. Viega, M. Long, and
C. Said. Smart-home technologies to assist older
people to live well at home. Journal of aging science,
1(1):1.
[10] N. Noury. Ailisa: Experimental platforms to evaluate
remote care and assistive technologies in gerontology.
In HEALTHCOMs, pages 67–72, 2005.
[11] T. G. Stavropoulos, G. Meditskos, and
I. Kompatsiaris. Demaware2: Integrating sensors,
multimedia and semantic analysis for the ambient care
of dementia. Pervasive and Mobile Computing, 2016.