Article

Co-Creation With TickiT: Designing and Evaluating a Clinical eHealth Platform for Youth.

BC Children's Hospital, Division of Adolescent Medicine, University of British Columbia, Vancouver, BC, Canada.
JMIR Research Protocols 01/2013; 2(2):e42. DOI: 10.2196/resprot.2865
Source: PubMed

ABSTRACT All youth are susceptible to mental health issues and risky behaviors, and for youth with chronic health conditions the consequences can be more significant than for their healthy peers. The American Academy of Pediatrics recommends standardized paper-based questionnaires for health-risk screening in community practice. In hospitals, psychosocial screening is traditionally undertaken using the Home, Education, Eating, Activities, Drugs, Depression, Sex, Safety (HEEADDSS) interview; however, time constraints and patient/provider discomfort limit its implementation. We report findings from an eHealth initiative undertaken to improve uptake of psychosocial screening among youth.
Youth are sophisticated "technology natives." Our objective was to leverage youth's comfort with technology by creating a youth-friendly, interactive mobile eHealth psychosocial screening tool, TickiT. Patients enter data into the mobile application before a clinician visit. Responses are compiled into a report that generates alerts for clinicians, shifting the clinical focus from collecting information to focused management. Design goals included improving the patient experience, improving efficiency through patient-based electronic data entry, and supporting the collection of aggregated data for research.
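The abstract does not specify how TickiT's report generates alerts; the sketch below is a hedged illustration only of how screening responses might be flagged for a clinician. The question IDs, alert rules, and the flag_responses helper are all hypothetical and are not drawn from the published platform.

```python
# Hypothetical sketch: turning psychosocial screening answers into
# clinician alerts. Question IDs, thresholds, and rules are invented
# for illustration; they do not reflect TickiT's actual design.
from dataclasses import dataclass

@dataclass
class Answer:
    question_id: str  # e.g., "mood_low_2wk"
    value: str        # patient's selected response

# Invented alert rules: question ID -> responses that warrant follow-up.
ALERT_RULES = {
    "mood_low_2wk": {"most days", "every day"},
    "substance_use": {"weekly", "daily"},
    "feels_safe_at_home": {"no"},
}

def flag_responses(answers: list[Answer]) -> list[str]:
    """Return question IDs whose answers should alert the clinician."""
    return [
        a.question_id
        for a in answers
        if a.value.lower() in ALERT_RULES.get(a.question_id, set())
    ]

if __name__ == "__main__":
    report = [
        Answer("mood_low_2wk", "most days"),
        Answer("substance_use", "never"),
        Answer("feels_safe_at_home", "yes"),
    ]
    # The clinician sees only the flagged items, shifting the visit
    # from data collection to focused management.
    print(flag_responses(report))  # ['mood_low_2wk']
```

Under these assumptions, a pre-visit report reduces to a short list of flagged items rather than raw questionnaire data.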
This paper describes the iterative design and evaluation processes undertaken to develop TickiT, including the co-creation process and a pilot study using mixed qualitative and quantitative methods. A collaborative industry/academic partnership engaged stakeholders (youth, health care providers, and administrators) in the co-creation development process. An independent descriptive study conducted in 2 Canadian pediatric teaching hospitals assessed the feasibility of the platform in both inpatient and ambulatory clinical settings, examining responses from both providers and patients.
The independent pilot feasibility study included 80 adolescents aged 12-18 years and 38 medical staff (residents, inpatient and outpatient pediatricians, and surgeons). Youth uptake was 99% (79/80), and survey completion was 99% (78/79; 90 questions). Youth found the tool easy to understand (92%, 72/78), easy to use (92%, 72/78), and efficient (80%, 63/79 completing in under 10 minutes). Residents were most positive about the application and surgeons were least positive. All inpatient providers obtained new patient information.
Co-creative design methodology with stakeholders was effective for informing the design and development processes and for leveraging eHealth opportunities. Continuing stakeholder engagement has further fostered platform development. The platform has the potential to meet the IHI Triple Aim goals. Clinical adaptation requires planning, training, and support for health care providers to adjust their practices.

Related article
ABSTRACT: eHealth interventions appear and change so quickly that they challenge the way we conduct research. By the time a randomized trial of a new intervention is published, technological improvements and clinical discoveries may make the intervention dated and unappealing. This, and the spate of health-related apps and websites, may lead consumers, patients, and caregivers to use interventions that lack evidence of efficacy. This paper aims to offer strategies for increasing the speed and usefulness of eHealth research. It describes two types of strategies based on the authors' own research and the research literature: those that improve the efficiency of eHealth research, and those that improve its quality.

Efficiency strategies include: (1) think small: conduct small studies that can target discrete but significant questions and thereby speed knowledge acquisition; (2) use efficient designs: use methods such as fractional-factorial and quasi-experimental designs and surrogate endpoints, and experimentally modify and evaluate interventions and delivery systems already in use (a hedged sketch of a fractional-factorial layout follows the citation below); (3) study universals: focus on timeless behavioral, psychological, and cognitive principles and systems; (4) anticipate the next big thing: listen to voices outside normal practice and connect different perspectives for new insights; (5) improve information delivery systems: researchers should apply their communications expertise to enhance inter-researcher communication, which could synergistically accelerate progress and capitalize on the availability of "big data"; and (6) develop models, including mediators and moderators: valid models are remarkably generative, and tests of moderation and mediation should elucidate boundary conditions of effects and treatment mechanisms.

Quality strategies include: (1) continuous quality improvement: researchers need to borrow engineering practices such as the continuous enhancement of interventions to incorporate clinical and technological progress; (2) help consumers identify quality: consumers, clinicians, and others all need to easily identify quality, suggesting the need to efficiently and publicly index intervention quality; (3) reduce the costs of care: concern with health care costs can drive intervention adoption and use and lead to novel intervention effects (eg, reduced falls in the elderly); and (4) deeply understand users: a rigorous evaluation of the consumer's needs is a key starting point for intervention development.

The challenges of distinguishing and distributing scientifically validated interventions are formidable. The strategies described are meant to spur discussion and further thinking, which are important given the potential of eHealth interventions to help patients and families.
Journal of Medical Internet Research 01/2014; 16(2):e36.
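To make strategy (2) above concrete, here is a minimal sketch of a 2^(3-1) fractional factorial design: three two-level intervention components studied in four arms instead of the eight a full factorial would require. The component names are invented for illustration and are not taken from the cited paper.

```python
# Minimal sketch of a 2^(3-1) fractional factorial design.
# Components and their levels are hypothetical examples of eHealth
# intervention features, not taken from the cited paper.
from itertools import product

# Full factorial over the first two components; the third component's
# level is set by the defining relation C = A*B (confounded with the
# A x B interaction), halving the number of study arms from 8 to 4.
design = []
for a, b in product([-1, +1], repeat=2):
    c = a * b
    design.append({"reminders": a, "peer_support": b,
                   "tailored_feedback": c})

for i, arm in enumerate(design, start=1):
    settings = {k: ("on" if v == 1 else "off") for k, v in arm.items()}
    print(f"arm {i}: {settings}")
# Four arms still allow estimating each component's main effect, at
# the cost of aliasing each main effect with a two-way interaction.
```

The design choice here is the usual fractional-factorial trade-off: fewer arms (and thus smaller, faster studies) in exchange for confounded higher-order interactions, which matches the abstract's emphasis on speeding knowledge acquisition.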