Original Paper
Modifying the Mobile App Rating Scale With a Content Expert:
Evaluation Study of Deaf and Hard-of-Hearing Apps
Ryan Lee Romero*, BPH; Frederick Kates*, PhD, MBA; Mark Hart*, EdD; Amanda Ojeda, BS; Itai Meirom, BS;
Stephen Hardy*, MEd
College of Public Health and Health Professions, University of Florida, Gainesville, FL, United States
*these authors contributed equally
Corresponding Author:
Ryan Lee Romero, BPH
College of Public Health and Health Professions
University of Florida
1225 Center Drive
Gainesville, FL
United States
Phone: 1 (352) 273 6060
Email: ryan.romero1000@ufl.edu
Abstract
Background: The spread of technology and dissemination of knowledge across the World Wide Web has prompted the
development of apps for American Sign Language (ASL) translation, interpretation, and syntax recognition. There is limited
literature regarding the quality, effectiveness, and appropriateness of mobile health (mHealth) apps for the deaf and hard-of-hearing
(DHOH) that aim to aid the DHOH in their everyday communication and activities. Other than the star-rating system with
minimal comments regarding quality, the evaluation metrics used to rate mobile apps are commonly subjective.
Objective: This study aimed to evaluate the quality and effectiveness of DHOH apps using a standardized scale. In addition, it
aimed to identify content-specific criteria to improve the evaluation process by using a content expert, and to use the content
expert to more accurately evaluate apps and features supporting the DHOH.
Methods: A list of potential apps for evaluation was generated after a preliminary screening for apps related to the DHOH.
Inclusion and exclusion criteria were developed to refine the master list of apps. The study modified a standardized rating scale
with additional content-specific criteria applicable to the DHOH population for app evaluation. This was accomplished by including
a DHOH content expert in the design of content-specific criteria.
Results: The results indicate a clear distinction in Mobile App Rating Scale scores among apps within the study’s 3 app categories:
ASL translators (highest score=3.72), speech-to-text (highest score=3.6), and hard-of-hearing assistants (highest score=3.90). Of
the 217 apps obtained from the search criteria, 21 apps met the inclusion and exclusion criteria. Furthermore, the limited
consideration for measures specific to the target population along with a high app turnover rate suggests opportunities for improved
app effectiveness and evaluation.
Conclusions: As more mHealth apps enter the market for the DHOH population, more criteria-based evaluation is needed to
ensure the safety and appropriateness of the apps for the intended users. Evaluation of population-specific mHealth apps can
benefit from content-specific measurement criteria developed by a content expert in the field.
(JMIR Mhealth Uhealth 2019;7(10):e14198) doi: 10.2196/14198
KEYWORDS
eHealth; mobile health; mHealth; mobile app; hearing; deaf persons; sign language
Introduction
Background
The continuous growth of mobile phone technology opens new
opportunities for mobile health (mHealth) apps. The improved
computing capability, increased memory, and the open operating
systems of smartphones support mHealth app development and
help shape the future of health care [1,2]. In 2018, there were
approximately 205.4 billion mobile apps downloaded
worldwide, with a forecasted growth to 258.2 billion by 2022
[3]. This emerging technology can create a more inclusive and
accessible environment for the deaf and hard-of-hearing
(DHOH) population, representing more than 7 million
Americans or over 2% of the US population [4,5]. New mHealth
apps leveraging the internet can help to reduce barriers for
individuals with disabilities, which is a crucial component of
the Americans with Disabilities Act [6]. Disability and health
inclusion strategies include identifying and eliminating
communication barriers for people with hearing impairments
[7]. Universal accessibility of apps that can increase the
multidirectional communication between DHOH persons and
those who are not is essential for social inclusion [8].
Although there are mHealth apps available to the DHOH
population, there is minimal information regarding the quality,
features, effectiveness, and maintenance of these apps.
Information regarding the status of DHOH apps should be
expanded; this information will give consumers valuable
knowledge relevant to choosing an app. In addition, information
on specific user needs could assist developers in recognizing
features desired by the population that have not been fulfilled
with the present mobile apps. Relying on star ratings and reviews
may be insufficient for app developers and analysts because of
the volume of ratings and the usefulness of the information [9].
To evaluate and rank apps relevant to the DHOH in a
quantitative manner, a standardized scale is needed, such as the
Mobile App Rating Scale (MARS) by Stoyanov et al [10]. The
MARS is an evaluation system divided into 5 core sections that
can accommodate a wide variety of mHealth apps.
The deaf population can be divided into groups based on
different criteria such as the degree of hearing loss, language
preference, educational experience, and integration in the Deaf
community or the hearing population [11]. Throughout this
paper, DHOH refers to the deaf and hard-of-hearing population in the
United States. Within this population, there is a distinction to
be made between persons identifying as Deaf (uppercase D)
and deaf (lowercase D). Those who identify as Deaf are actively
engaged in a common Deaf culture and the identity behind the
culture and prefer to use American Sign Language (ASL) or
use only ASL [12]. Persons identifying as deaf or
hard-of-hearing may comprise those who have postlingual
hearing loss, prefer to use English over ASL, or choose to
associate with the hearing culture [13]. Owing to the unique
characteristics of the DHOH population in the United States
and the complexity of ASL [14], there is no commonly used
written system for ASL, yet Web-based text is required for updating
content or simple user queries [15]. In the United States, more
than 500,000 individuals use sign language as their primary
mode of communication [16]. ASL interpreters are commonly
employed to resolve communication barriers between the DHOH
and others [17]. The Bureau of Labor Statistics projects 18%
growth for the industry from 2016 to 2026, which is much faster
than the average for all occupations. The United States is
expected to add 12,100 new positions by 2026 [18]. However,
it can take years to become fluent in ASL [19], and in many
areas of the country, there is a shortage of sign language
interpreters [17,20]. Although signing interpreters are a staple
of interpersonal communication for the DHOH community,
mobile apps can also bolster interpersonal communication in
numerous ways [21,22].
Objective
The aim of this study was to modify a standardized scale to
evaluate and rank apps designed for the DHOH. The study
sought the use of a content expert to develop content-specific
criteria that would gauge the presence of features relevant to
the target population.
Methods
Search and Selection Criteria
An initial screening of the DHOH apps was conducted in May
2018 across 2 mobile app stores: Apple’s App Store and the
Google Play Store (for Android platform). The initial search
criteria sought apps aimed toward assisting the DHOH
community in the United States.
The following search terms were used: “deaf,” “deaf
application,” “deaf hearing,” “hard of hearing,” “sign language
translator,” “sign language,” “sign language applications,” “ASL
translator,” “ASL sign language,” “sign language dictionary,”
“sign language keyboard,” “text to speech,” “hearing assistant,”
“hearing aid,” “deaf text to speech,” “deaf translator,” and “ASL
assistant.”
A list of potential apps to review was generated after a
preliminary screening was done based on the app description
in the stores and supporting screenshots to establish relevance
and initial inclusion. In-store reviews and star ratings were not
considered to prevent rater bias. The study received institutional
review board exemption (IRB201802567).
Inclusion Criteria of Apps and Processes
To best select a list of apps, inclusion and exclusion criteria
were developed for the master list of apps with the goals of the
study in mind, which were (1) to determine the objective quality
of hard-of-hearing apps on the mobile market and (2) to gauge
the affinity of these apps to integrate the hard-of-hearing
population into the national community via the elimination of
a social barrier. As the 2 main resource pools for target apps
were the Apple’s App Store and the Google Play Store, only
apps from these 2 repositories were considered.
Initial Apps Excluded From the Study
Apps from both stores (n=217) underwent an initial filtering to
obtain a diverse spread of apps related to the DHOH community
based on the search criteria. Apps not directly related to the
DHOH were not considered further, which included apps related
to music or musical instruction (n=39); apps related to sound
or pitch detection (n=9); apps related to handwriting or
signatures (n=16); games or apps intended for gaming purposes
(n=37); apps not otherwise related to the DHOH or that contained
DHOH-related keywords (such as apps with “sound,” “hearing,”
“hard to hear,” or other hearing-related keywords
within the app title) but did not include DHOH features (n=48);
and apps using a sign language other than ASL (eg, British Sign
Language or French Sign Language; n=4; see Figure 1).
Figure 1. Inclusion and exclusion criteria flowchart. ASL: American Sign Language; DHOH: deaf and hard-of-hearing; MARS: Mobile App Rating
Scale.
Considered Apps Excluded From the Study
After the initial filtering, apps were screened for compatibility
with the 3 categories of the study (ASL translators,
speech-to-text, and hearing assistants; n=64). Apps having a
primary focus other than the aid of the DHOH population were
excluded in this step. Apps designed for children (n=10),
religious purposes (n=2), or mental health topics (n=2) were
not considered for inclusion criteria screening (see Figure 1).
Final Apps Excluded From the Study
Remaining apps (n=50) were then assessed against the inclusion
and exclusion criteria. Apps meeting 4 final exclusion criteria
were not considered for evaluation with the MARS: apps
including assistive features relevant to other special populations
(apps with assistive features intended for other populations were
excluded to standardize the scoring of the study’s
content-specific measures; n=11); apps requiring external
devices or a subscription to function (n=6); apps in beta testing
or with features marked as coming soon or otherwise incomplete
(n=8); and apps representing an older version of an app being
considered (n=4; see Figure 1).
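For reference, the arithmetic of the screening funnel described above and summarized in Figure 1 can be reproduced from the counts reported in the text. The following sketch is illustrative only; it is not the authors' actual workflow, and the category labels are shorthand for the exclusion groups named above.

```python
# Illustrative sketch of the Figure 1 screening funnel using counts from the text.
initial_pool = 217

initial_exclusions = {  # apps not directly related to the DHOH
    "music or musical instruction": 39,
    "sound or pitch detection": 9,
    "handwriting or signatures": 16,
    "games": 37,
    "hearing-related keywords only, no DHOH features": 48,
    "non-ASL sign languages": 4,
}

category_exclusions = {  # apps outside the study's three categories
    "designed for children": 10,
    "religious purposes": 2,
    "mental health topics": 2,
}

final_exclusions = {  # apps failing the four final exclusion criteria
    "assistive features for other special populations": 11,
    "external device or subscription required": 6,
    "beta or otherwise incomplete": 8,
    "older version of an app already considered": 4,
}

after_initial = initial_pool - sum(initial_exclusions.values())     # 64
after_category = after_initial - sum(category_exclusions.values())  # 50
evaluated = after_category - sum(final_exclusions.values())         # 21
print(after_initial, after_category, evaluated)  # 64 50 21
```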
The use of mHealth apps typically falls under the definition of
assistive technology (AT), which can be a piece of equipment,
software program, or product used to increase, maintain, or
improve the functional capabilities of persons with disabilities
[23]. Therefore, the study includes a content expert (SH) to
evaluate the MARS criteria and to create criteria to better match
particular assistive apps to specific needs. Owing to the
complexity of evaluating ASL apps, particularly for a person
who has no hearing difficulties or ASL experience, it was
evident that a content expert was needed.
Apps
The development of a multicategory master list of apps allowed
for an independent evaluation of each category of apps with the
MARS scale and developed content-specific measures (see Figure 2).
Figure 2. Master list of apps. ASL: American Sign Language.
American Sign Language Translators
Apps providing translational functionality to and from ASL
were considered in this section. Statistically, “congenital hearing
loss affects two to three infants per 1,000 live births” [24]. As
ASL is the main component of communication for many Deaf
persons in the United States [25], in addition to the unique
accessibility needs of Deaf persons regarding communication
[11], it was appropriate to allocate an entire section of study to
this app category. ASL is unique from verbal language as it
provides a mode of communication based on symbols and visual
cues [25]. In addition, those who are prelingually deafened (and
who typically associate with the Deaf culture) are more reliant
on ASL-based communication than verbal or written English
[26]. Apps that serve to translate sign language to and from
words, whether it be by visual interpretation or ASL dictionary
function, were sorted into this category.
Speech-to-Text
Threaded speech-to-text apps are also helping individuals with
hearing loss to take part in conversations with family and groups
of people. The best hard-of-hearing lip-readers can understand
approximately 30% of a dialogue [27]. With threaded
speech-to-text apps, the accuracy of conversation increases to
around 80% to 90% [28]. Speech-to-text allows communication
at a more conversational speed between the
hard-of-hearing and others. In the current app market, there are
a substantial number of these apps competing to be used, yet
many are not geared toward the hard-of-hearing population.
Although this category of apps offers great benefit to those who
were postlingually deafened or are hard of hearing,
speech-to-text translators may not be particularly useful to
individuals who were prelingually deafened, owing to a
lower level of English usage and comprehension [29].
Hearing Assistants
The study’s third category included apps that sought to improve
user communication and interaction in everyday life. This
section was created to give the study the opportunity to analyze
apps that did not fit into the other 2 categories yet provided
some feature that positively affected a DHOH person’s ability
to communicate. A content expert was used to determine
whether or not an app provided some benefit to a DHOH person
in social function or public navigation.
Data Collection and the Mobile App Rating Scale
The decision was made to use a reliable and flexible app quality
rating scale designed by expert research panelists to assess the
quality of mHealth apps via multiple descriptive factors [10].
The MARS was developed to satisfy the need for a reliable and
objective scale that could rate the degree to which mHealth apps
satisfy the defined quality criteria [10]. The study used the 4
MARS classifications (engagement, functionality, aesthetics, and
information quality), plus an additional content-specific
section. The subjective quality section of the MARS was
replaced with the custom-designed content-specific measures
section to better gauge app quality and appropriateness. With
the inclusion of the study’s 4 custom-designed criteria, apps
were evaluated on 23 metrics. All the metrics were quantified
by assigning integer values: 1=poor, 2=fair, 3=acceptable,
4=good, and 5=excellent.
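To make the scoring arithmetic concrete, the sketch below shows one plausible way a single rater's 23 item scores could be aggregated into section means and an overall app score. The item counts per section follow the original MARS (5 engagement, 4 functionality, 3 aesthetics, and 7 information quality items) plus the study's 4 content-specific items; the example ratings and the equal-weight averaging are illustrative assumptions, not the authors' published scoring sheet.

```python
# Minimal sketch: aggregate one rater's integer ratings (1=poor ... 5=excellent)
# into section means and an overall score. Ratings shown are hypothetical.
from statistics import mean

ratings = {
    "engagement":       [4, 3, 4, 4, 3],
    "functionality":    [4, 4, 3, 4],
    "aesthetics":       [3, 4, 3],
    "information":      [4, 3, 3, 4, 3, 4, 3],
    "content_specific": [3, 4, 3, 4],  # signing space, distractions,
                                       # assistive features, societal integration
}
assert sum(len(items) for items in ratings.values()) == 23  # 23 metrics in total

section_scores = {name: mean(items) for name, items in ratings.items()}
overall = mean(section_scores.values())  # mean of the section means

for name, score in section_scores.items():
    print(f"{name}: {score:.2f}")
print(f"overall: {overall:.2f}")
```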
The mobile apps were analyzed using the MARS by 3 members
of the research team (RR, IM, and AO), designated as raters.
Each rater underwent 2 training sessions to correctly attribute
scores from the MARS to the apps. These training sessions were
supplemented by a video developed by the MARS creators to
aid in the rater training and calibration [30]. The MARS training
video is a reference created by the authors of the MARS to
explain the purpose of the scale, the app characteristics that
each subsection of the scale measures, and the guidelines for
evaluating each subsection. The video describes the relative
quality of each app characteristic that is appropriate for each
score (poor-excellent), along with examples of quality features.
Furthermore, the MARS training video acted as a knowledge
base during the construction of content-specific measures and
generated ideas as to how content-specific criteria may be
scored. These ideas aided the refinement of the content-specific
questions after development with the content expert. The 3 raters
are members of the College of Public Health who are involved
in hard-of-hearing research. The raters were all trained with
knowledge from the content expert to give a shared and equal
understanding of the needs and characteristics of the DHOH
population. A total of 3 control apps were used to determine
the intraclass correlation coefficients among raters based on
Shrout and Fleiss’ guidelines [31]. A test similar to Fleiss’
kappa, the Krippendorff alpha, was used to test for interrater
agreement [32]. An alpha of .91 was obtained. All issues
regarding rater agreement were discussed among team members
and reevaluated for appropriate concurrence.
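The paper does not state which software was used to compute the agreement statistics. As one possible illustration, the sketch below computes a Krippendorff alpha with the open-source Python krippendorff package, treating the 1-5 ratings as ordinal; the rater data shown are hypothetical placeholders, not the study's actual control-app scores.

```python
# Illustrative interrater agreement calculation (assumes `pip install krippendorff`).
import numpy as np
import krippendorff

# Hypothetical item ratings for the control apps, one row per rater;
# np.nan would mark any item a rater could not score.
rater_1 = [4, 3, 4, 5, 3, 4, 4, 3]
rater_2 = [4, 3, 4, 4, 3, 4, 4, 3]
rater_3 = [4, 3, 5, 5, 3, 4, 4, 3]

alpha = krippendorff.alpha(
    reliability_data=np.array([rater_1, rater_2, rater_3], dtype=float),
    level_of_measurement="ordinal",
)
print(f"Krippendorff alpha: {alpha:.2f}")
```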
Use of a Content Expert and “Content-Specific”
Measures
Content experts can be useful when determining the relevance
of a proposed study to a specific population or subject [33].
Specifically, the study revolves around the needs, desires, and
attributes of the DHOH population. The study’s content expert
(SH) is a professor of ASL who himself is deaf. The Deaf culture
is a broad network of beliefs, values, and rules for behaviors
that are incorporated into the lives of many with hearing
difficulties [12]. The use of a content expert allowed the study
design to consider the needs and desires of this population as
related to mobile app features so that valid and meaningful
metrics could be designed to test for the presence of such
features.
An additional scoring section, the content-specific section, was
tailored with a content expert (SH) for hearing-specific apps
following the MARS authors’ interest in criteria applicable to
specific populations [10]. This section included 4 subfactors:
signing space, distractions, assistive features, and societal
integration potential. To help create the content-specific
classifications and improve validity, a content expert with
experience in teaching ASL and ASL linguistics and Deaf
studies was added to the study. After the initial review of apps,
concerns were discussed with the content expert to further refine
the criteria. All the category factors were quantified by assigning
integer values: 1=poor, 2=fair, 3=acceptable, 4=good, and
5=excellent.
Signing Space
Evaluating the signing space is an important measurement
criterion as this area gauges the digital interpreter’s use of sign
language. In sign language, the signing space encompasses the
distinctive locations surrounding the signer [34], particularly
the space in front of the signer extending from the waist to the
forehead [35]. The content expert emphasized that signing in
apps should be in a consistent space and easy to follow. The
content expert developed several questions with measurable
criteria to assist the evaluators in scoring apps: Does the signing
stay within approximately 12 inches of the body’s center mass?
Does the reader have to concentrate on entirely different areas
when facial expressions are used? Is the signing too fast or too
slow?
Distractions
Distraction evaluation includes any attribute or behavior by the
app or interpreter that may be distracting to the reader. These
include distracting clothing (with patterns or multiple colors),
bright nail polish or jewelry, and extraneous events occurring
in the background or the background color [15,36]. Furthermore,
signers should consider the contrast between their skin
complexion and their clothing and the background to focus the
attention around the hands [37].
Assistive Features
The initial purpose of hard-of-hearing apps and translators is
to allow those with hearing difficulties to achieve social function
similar to that of those who are not hard of hearing. One way the apps
can achieve this is by integrating the assistive features within
their menus and the graphical user interfaces (GUIs). These
assistive features can be (but are not limited to) enlarged buttons
and text, a magnifier function, closed captioning on app
dialogue, and slow scrolling text of a large font style.
Societal Integration
One desirable function of apps that seek to assist the DHOH
and similar populations would be to provide functions that
further aid in the user’s inclusion and experience in society. The
nature of certain apps may lead them to have a higher affinity
for this criterion, but overall, this section can be rated by
reviewing the potential for a particular app to integrate a DHOH
person into society (via social outings, dating, or simply acting
as an aid in public).
Results
App Turnover
One unexpected outcome was the high level of app turnover
noted in the ASL translator section. For the study’s purposes,
app turnover has been defined as an app being added to the
master list of apps for evaluation but later being wholly
inaccessible or unusable because of its removal from software
repositories, extremely poor design, or abandonment by the
developer. A total of 5 of the 7 apps were lost to app turnover:
ASL Provider, APEX VRI, Interpret Live, Fluency Mobile, and
ASL Coach. Although the study was left with a reasonable
distribution of apps to evaluate, it should be noted that the high
level of app turnover serves as a general reflection of the state
of hard-of-hearing apps.
Scoring for American Sign Language Translators
Upon completion of scoring for the ASL translator apps, the
consensus among the 3 evaluators was that this category
contained quality apps, yet it was subject to a large amount of app
turnover, as previously discussed. Of the 7 ASL translator apps
selected for evaluation, only 2 were accessible or even
discoverable in either app repository. The best app rated in
the category was Signily (3.72/5), with the second best being
The ASL App (3.642/5). Both apps had acceptable to good
general attributes and acceptable content-specific qualities (see
Table 1).
Scoring for Speech-to-Text
Of the 6 apps tested in the speech-to-text category, all 6 apps
were still accessible at the end of the scoring period. The best
app in the category was Text to Speech! by Gwyn Durbridge
(3.595/5), followed by Speak4Me–Text to Speech (3.584/5)
and Smart Record: Audio Recorder (3.48/5). This category of
apps had the lowest average score for all 4 general attribute
categories, and the lowest average score in the content-specific
measures section (see Table 1). However, because these apps
were not explicitly designed for the DHOH population, they
should only be faulted if drastic improvements toward the target
population’s needs would be easily achievable.
Scoring for Hearing Assistants
Of the 8 apps tested in the hearing assistant category, all 8 apps
were still accessible at the end of the scoring period. The best
app in the hearing assistant category was ListenClear (3.90/5),
followed by Petralex Hearing Aid (3.89/5) and Sound Alert
(3.73/5). Of the 3 categories evaluated, consensus was reached
that this particular category boasted the most consistent level
of engagement across apps (see Table 1). In addition, the
average functionality score was the highest in this category.
Although the quality of the general attributes of these apps is
either acceptable or good, several of the apps had only fair or
poor scores for the content-specific measures (see Table 1).
There was a notable variance among raters in the
content-specific section while evaluating the app ListenClear.
Although the ListenClear app has an exceptional GUI, assistive
features, and minimal distractions, it would be desirable for this
app to have increased societal integration features considering
how it is marketed to the DHOH population (as a hearing
assistant).
Table 1. Mobile App Rating Scale app quality ratings.

Mobile health app ranking                Engagement  Functionality  Aesthetics  Information quality  Content-specific criteria^a

ASL^b translators
  Signily                                   3.8          3.8           3.2           3.8                 3.9
  The ASL App                               3.0          4.4           3.9           3.8                 3.0

DHOH^c assistants
  ListenClear                               4.1          4.3           4.7           4.0                 2.5
  Petralex Hearing Aid                      3.9          4.3           3.9           3.9                 3.4
  Sound Alert                               3.8          4.3           4.1           3.5                 3.0
  BeWarned–App for DHOH                     4.2          4.1           3.7           3.6                 3.1
  Visual Hearing Aid                        3.3          4.1           3.2           3.6                 2.0
  eyeHear                                   2.7          3.8           3.0           3.1                 2.2
  Live Caption                              2.8          3.8           2.6           2.9                 2.4
  Earfy                                     2.9          3.3           2.0           2.4                 1.8

Speech-to-text
  Text to Speech!                           3.9          3.8           3.6           3.5                 3.2
  Speak4Me–Text to Speech                   3.9          4.1           3.4           3.4                 3.0
  Smart Record: Audio Recorder              3.1          3.8           4.1           4.4                 1.0
  Transcribe–Speech to Text                 3.1          3.3           3.4           3.2                 2.1
  TextToSpeech (Iconic Solutions, LLC)      2.6          2.8           2.7           2.4                 1.5
  Text to Speech for iMessage               1.9          1.7           1.9           2.1                 1.3

^a Content-specific criteria are based on the interest in an app-specific section by the original Mobile App Rating Scale authors to evaluate apps for specific populations.
^b ASL: American Sign Language.
^c DHOH: deaf and hard-of-hearing.
Discussion
The State of Hard-of-Hearing Assistance Technology
There is both a need and demand for continuing development
of Android and iOS app support for the DHOH. This study’s
findings agree with previous findings that app developers prefer
Android and iOS for their projects [38]. Of the 217 apps from
the search criteria, 50 apps were assessed against the inclusion
and exclusion criteria for the study. Given the relatively low
yield of candidate apps from the App Store and Google Play
Store, further development of apps targeted toward the DHOH
population appears necessary. A refinement of the
master list of apps to 21 apps based on the inclusion and
exclusion criteria was thought sufficient to mitigate any
anomalies in the study. When it came time to evaluate the apps,
however, it was observed that 5 of the 7 ASL translator
apps had become unavailable or had otherwise disappeared
from the app stores. This suggests a high rate of app turnover
in the subject field, which may contribute to the limited
availability of quality DHOH apps. It may be inferred that
DHOH apps are experiencing high turnover rates, consistent
with the current AT turnover rates as high as 75% to 80%
[39,40]. It seems apparent that although these apps may have
been designed by skilled developers, features that may be basic
necessities to some persons in the DHOH population are being
overlooked or poorly implemented. Furthermore, the relative
lack of high-level features, such as an app’s ability to act as a
social advocate for the user, underlines the state of the DHOH
apps: available, but limited in scope.
The Value of Hard-of-Hearing Assistance Technology
One of the tenets of the study was to determine the quality of
the available DHOH apps so that their value to DHOH persons
could be inferred. Given the current state of mobile
assistance technology for the intended population, it is
reasonable to say that these apps provide augmentation to a deaf
or hard-of-hearing person’s ability to navigate in public, interact
both publicly and with family, and be connected to other deaf
persons through the availability of features that facilitate
interpersonal interaction. It is notable, however, that there should
be more research into other purposes for these apps. For
example, there may be a useful purpose for these kinds of apps
in both domestic and emergency scenarios such as hurricanes
or severe storms. Indeed, the development of emergency AT
could help to reduce the disproportionately high level of
morbidity among the DHOH during natural disasters [41].
Mobile apps or devices could have a preprogrammed
functionality that assists DHOH persons in an emergency,
although this warrants further research.
Use of a Content Expert to Evaluate Mobile Apps
One measure seeking to validate the study’s design and metrics
was the consultation of a content expert (SH) who could act as
a representative of the DHOH community with the intent of
gauging the quality and applicability of DHOH apps. As a deaf
individual, he was able to advocate for the needs of the target
population from a position of membership; the study gained
insight into the needs of the population along with reasonable
knowledge to design content-specific measures included in the
MARS. This proved invaluable in guiding the direction of the
study to focus not only on the features of the apps that would
be representative of a quality app but also on those that could
augment the interpersonal and social capability of the user.
Strengths and Limitations
The study leverages the MARS, which is considered a valid
and reliable scale for evaluating mHealth apps. The study
introduces a novel MARS modification to create content-specific
criteria using a content expert to address the needs of the intended
end users. One of the limitations of this
review was that the search was limited to US app stores. This
limitation might have restricted the results and the quality of
the apps for the DHOH, particularly if other countries are further
ahead in the development of DHOH apps. Only DHOH apps
that are publicly available were included, which could have
excluded apps developed by a specific health care network that
focuses on the DHOH population. A relatively small sample of
apps was included in the study. The entire pool of apps for the
DHOH was not large to begin with, which may reflect that the
DHOH population is a smaller segment of the overall
population. There was a high rate of DHOH app turnover, which
might be due to poor information quality or to apps not
meeting the needs of the DHOH population.
Future Research
The authors of this study suggest continuing research into the
use of the modified MARS scale with content-specific
(app-specific) criteria developed with content experts to evaluate
mHealth apps for different populations. Further research into
the use of content experts while designing a study, and their
effect on the validity of subject-specific content will elucidate
more information about the potential benefits of using content
experts to design content-specific metrics. Additional research
is needed for both mHealth apps and DHOH apps to establish
additional criteria to measure information quality in terms of
patient safety and privacy. Being able to measure risks to
patient safety is a growing concern as some mHealth apps do
not follow evidence-based guidelines, or the app developers
have little or no medical training [42,43]. Another research
opportunity would be to design app evaluation criteria
that can aggregate different types of apps with beneficial
DHOH features (even if not directly intended for DHOH users). For
example, in this study, the initial search terms identified apps
such as audio analyzers and sound detectors, which were
excluded as they did not have DHOH features. However, 1
sound detection app, eezySoundDetector, recognized the
potential to assist the DHOH by providing flashing lights or
vibrating alarms for a fire or a crying child. There were other
types of apps that were excluded for only having minimal
DHOH features, yet these might make a difference for the
DHOH in a predominantly hearing world. Developing an
evaluation tool that can aggregate a wider variety of apps with
different assistive features could be a significant contribution
to the development of mHealth apps. In addition, adding a
qualitative component to studies that survey user preferences
and feedback on the usability of specific mHealth apps has
been shown to be effective. A preliminary study of a sound detection
algorithm that used a training and feedback survey from the
DHOH gathered firsthand insights about detection preferences, such
as running water in the home or a printer in the work
environment, which improved app usability [44]. Finally, this
study can be expanded with a follow-up review of the highest
ranked DHOH apps to assess effectiveness and tangible gains
identified by the users, as well as to measure the DHOH app
turnover.
Conclusions
The MARS remains a high-fidelity tool for app evaluation. The
study emphasizes the value of including a content expert early
in the mHealth app development process as well as the
evaluation process to improve effectiveness and to assist in
making criteria-based recommendations to end users. The focus
on mHealth apps for the DHOH population illustrates the
importance of including a related content expert. For someone
who has a hearing difficulty, it can be both reassuring and
empowering to see that mHealth app developers, evaluators, and
providers who recommend these products value an
understanding of the needs of the intended users. This can be
particularly true for an individual with hearing difficulties who
struggles with his or her identity in a world set in an
overwhelmingly hearing context.
Conflicts of Interest
None declared.
References
1. Sapci A, Sapci H. Digital continuous healthcare and disruptive medical technologies: m-Health and telemedicine skills
training for data-driven healthcare. J Telemed Telecare 2018 Aug 22:1357633X18793293. [doi: 10.1177/1357633X18793293]
[Medline: 30134779]
2. Terry M. Medical apps for smartphones. Telemed J E Health 2010;16(1):17-22. [doi: 10.1089/tmj.2010.9999] [Medline:
20070172]
3. Statista. Number of Mobile App Downloads Worldwide From 2016 to 2018. URL: https://www.statista.com/statistics/271644/
worldwide-free-and-paid-mobile-app-store-downloads/ [accessed 2019-02-23]
4. Census. Americans With Disabilities: 2010. URL: https://www.census.gov/library/publications/2012/demo/p70-131.html
[accessed 2018-12-12]
5. Carroll DD, Courtney-Long EA, Stevens AC, Sloan ML, Lullo C, Visser SN, Centers for Disease Control and Prevention
(CDC). Vital signs: disability and physical activity--United States, 2009-2012. MMWR Morb Mortal Wkly Rep 2014 May
9;63(18):407-413 [FREE Full text] [Medline: 24807240]
6. Americans with Disabilities Act. URL: https://www.ada.gov/ [accessed 2019-02-23]
7. Centers for Disease Control and Prevention. Disability and Health Inclusion Strategies. URL: https://www.cdc.gov/ncbddd/
disabilityandhealth/disability-strategies.html [accessed 2018-12-12]
8. Jaeger PT, Xie B. Developing online community accessibility guidelines for persons with disabilities and older adults. J
Disabil Policy Stud 2008 Oct 17;20(1):55-63. [doi: 10.1177/1044207308325997]
9. Maalej W, Kurtanović Z, Nabil H, Stanik C. On the automatic classification of app reviews. Requir Eng 2016;21(3):311-331.
[doi: 10.1007/s00766-016-0251-9]
10. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, Mani M. Mobile app rating scale: a new tool for
assessing the quality of health mobile apps. JMIR Mhealth Uhealth 2015 Mar 11;3(1):e27 [FREE Full text] [doi:
10.2196/mhealth.3422] [Medline: 25760773]
11. Reagan T. Toward an 'archeology of deafness': Etic and Emic constructions of identity in conflict. J Lang Identity Educ
2002 Jan 2;1(1):41-66. [doi: 10.1207/S15327701JLIE0101_4]
12. Jones M. Deafness as culture: a psychosocial perspective. Disabil Stud Q 2002 Apr 15;22(2). [doi: 10.18061/dsq.v22i2.344]
13. National Association of the Deaf. Community and Culture – Frequently Asked Questions. URL: https://www.nad.org/
resources/american-sign-language/community-and-culture-frequently-asked-questions/ [accessed 2019-07-06]
14. Morgan G. On language acquisition in speech and sign: development of combinatorial structure in both modalities. Front
Psychol 2014;5:1217 [FREE Full text] [doi: 10.3389/fpsyg.2014.01217] [Medline: 25426085]
15. Kacorri H, Huenerfauth M, Ebling S, Patel K, Menzies K, Willard M. Regression analysis of demographic and
technology-experience factors influencing acceptance of sign language animation. ACM Trans Access Comput
2017;10(1):1-33. [doi: 10.1145/3046787]
16. Mitchell RE, Young TA, Bachleda B, Karchmer MA. How many people use ASL in the United States? Why estimates
need updating. Sign Lang Stud 2006;6(3):306-335. [doi: 10.1353/sls.2006.0019]
17. McKee MM, Winters PC, Sen A, Zazove P, Fiscella K. Emergency department utilization among deaf American sign
language users. Disabil Health J 2015 Oct;8(4):573-578 [FREE Full text] [doi: 10.1016/j.dhjo.2015.05.004] [Medline:
26166160]
18. Bureau of Labor Statistics. Interpreters and Translators: Job Outlook. URL: https://www.bls.gov/ooh/
media-and-communication/interpreters-and-translators.htm#tab-6 [accessed 2019-02-23]
19. HandSpeak. How Long Does It Take to Learn ASL? URL: https://www.handspeak.com/learn/index.php?id=61 [accessed
2019-02-23]
20. National Association of the Deaf. URL: https://www.nad.org/ [accessed 2019-02-23]
21. Pang H. How does time spent on WeChat bolster subjective well-being through social integration and social capital? Telemat
Inform 2018 Dec;35(8):2147-2156. [doi: 10.1016/j.tele.2018.07.015]
22. Chan M. Mobile phones and the good life: Examining the relationships among mobile use, social capital and subjective
well-being. New Media Soc 2015;17(1):96-113. [doi: 10.1177/1461444813516836]
23. Assistive Technology Industry Association. URL: https://www.atia.org/ [accessed 2018-05-30]
24. Gaffney M, Eichwald J, Grosse S, Mason C. Identifying infants with hearing loss-United States, 1999-2007. Morb Mortal
Wkly Rep 2010;59(8):220-223 [FREE Full text]
25. The National Institute on Deafness and Other Communication Disorders. 2015. American Sign Language. URL: https://www.
nidcd.nih.gov/health/american-sign-language [accessed 2019-02-23]
26. Dickson M, Magowan R, Magowan R. Meeting deaf patients' communication needs. Nurs Times 2014;110(49):12-15.
[Medline: 26016132]
27. Gannon C. The deaf community and sexuality education. Sex Disabil 1998;16(4):283-293. [doi: 10.1023/A:1023067828478]
28. Ahmed T, Wahid M, Habib M. Implementation of Bangla Speech Recognition in Voice Input Speech Output (VISO)
Calculator. In: Proceedings of the 2018 International Conference on Bangla Speech and Language Processing. 2018 Presented
at: ICBSLP'18; September 21-22, 2018; Sylhet, Bangladesh. [doi: 10.1109/icbslp.2018.8554773]
29. Strong M, Prinz PM. A study of the relationship between American sign language and English literacy. J Deaf Stud Deaf
Educ 1997;2(1):37-46. [doi: 10.1093/oxfordjournals.deafed.a014308] [Medline: 15579834]
30. YouTube. 2018. MARS training video. URL: https://www.youtube.com/watch?v=25vBwJQIOcE [accessed 2018-04-26]
31. Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull 1979 Mar;86(2):420-428. [doi:
10.1037//0033-2909.86.2.420] [Medline: 18839484]
32. Krippendorff K. University of Pennsylvania ScholarlyCommons. 2011. Computing Krippendorff's Alpha-Reliability. URL:
https://repository.upenn.edu/cgi/viewcontent.cgi?article=1043&context=asc_papers [accessed 2019-10-01]
33. Polit DF, Beck CT. The content validity index: are you sure you know what's being reported? Critique and recommendations.
Res Nurs Health 2006 Oct;29(5):489-497. [doi: 10.1002/nur.20147] [Medline: 16977646]
34. Liddell S. Grammar, Gesture, And Meaning In American Sign Language. Cambridge: Cambridge University Press; 2003.
35. Emmorey K, Damasio H, McCullough S, Grabowski T, Ponto LL, Hichwa RD, et al. Neural systems underlying spatial
language in American Sign Language. Neuroimage 2002 Oct;17(2):812-824. [doi: 10.1006/nimg.2002.1187] [Medline:
12377156]
36. Sunjoto M, Unwin J. UCL Discovery. 2017. Traffic signage conspicuity. URL: http://discovery.ucl.ac.uk/1570053/1/
Unwin_LJ%20July_Aug%20p30-35%20new-ilovepdf-compressed.pdf [accessed 2019-10-01]
37. Pisharady PK, Vadakkepat P, Loh AP. Attention based detection and recognition of hand postures against complex
backgrounds. Int J Comput Vis 2013;101(3):403-419. [doi: 10.1007/s11263-012-0560-5]
38. Martínez-Pérez B, de la Torre-Díez I, López-Coronado M. Mobile health applications for the most prevalent conditions by
the World Health Organization: review and analysis. J Med Internet Res 2013 Jun 14;15(6):e120 [FREE Full text] [doi:
10.2196/jmir.2600] [Medline: 23770578]
39. Sharpe M. CORE. 2010. Assistive Technology Attrition: Identifying Why Teachers Abandon Assistive Technologies. URL:
https://core.ac.uk/download/pdf/51097930.pdf [accessed 2019-10-01]
40. Verza R, Carvalho ML, Battaglia MA, Uccelli MM. An interdisciplinary approach to evaluating the need for assistive
technology reduces equipment abandonment. Mult Scler 2006 Feb;12(1):88-93. [doi: 10.1191/1352458506ms1233oa]
[Medline: 16459724]
41. Kamau PW, Ivey SL, Griese SE, Qari SH. Preparedness training programs for working with deaf and hard of hearing
communities and older adults: lessons learned from key informants and literature assessments. Disaster Med Public Health
Prep 2018 Oct;12(5):606-614. [doi: 10.1017/dmp.2017.117] [Medline: 29041996]
42. Ferrero NA, Morrell DS, Burkhart CN. Skin scan: a demonstration of the need for FDA regulation of medical apps on
iPhone. J Am Acad Dermatol 2013 Mar;68(3):515-516. [doi: 10.1016/j.jaad.2012.10.045] [Medline: 23394920]
43. Huckvale K, Car M, Morrison C, Car J. Apps for asthma self-management: a systematic assessment of content and tools.
BMC Med 2012 Nov 22;10:144 [FREE Full text] [doi: 10.1186/1741-7015-10-144] [Medline: 23171675]
44. Bragg D, Huynh N, Ladner R. A Personalizable Mobile Sound Detector App Design for Deaf and Hard-of-Hearing Users.
In: Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility. 2016 Presented
at: ASSETS'16; October 23-26, 2016; Reno, Nevada, USA p. 3-13. [doi: 10.1145/2982142.2982171]
Abbreviations
ASL: American Sign Language
AT: assistive technology
DHOH: deaf and hard-of-hearing
GUI: graphical user interface
MARS: Mobile App Rating Scale
mHealth: mobile health
Edited by G Eysenbach; submitted 29.03.19; peer-reviewed by R Wolfe, R Dewey; comments to author 15.06.19; revised version
received 25.07.19; accepted 18.08.19; published 25.10.19
Please cite as:
Romero RL, Kates F, Hart M, Ojeda A, Meirom I, Hardy S
Modifying the Mobile App Rating Scale With a Content Expert: Evaluation Study of Deaf and Hard-of-Hearing Apps
JMIR Mhealth Uhealth 2019;7(10):e14198
URL: http://mhealth.jmir.org/2019/10/e14198/
doi: 10.2196/14198
PMID:
©Ryan Lee Romero, Frederick Kates, Mark Hart, Amanda Ojeda, Itai Meirom, Stephen Hardy. Originally published in JMIR
Mhealth and Uhealth (http://mhealth.jmir.org), 31.10.2019. This is an open-access article distributed under the terms of the
Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution,
and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The
complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and
license information must be included.