
Validation of an instrument to assess evidence-based practice knowledge, attitudes, access, and confidence in the dental environment.

Educational and Faculty Development, Dental School, University of Texas Health Science Center at San Antonio, 7703 Floyd Curl Drive, San Antonio, TX 78229, USA.
Journal of Dental Education (Impact Factor: 1.04). 02/2011; 75(2):131-44.
Source: PubMed

ABSTRACT This article reports the validation of an assessment instrument designed to measure the outcomes of training in evidence-based practice (EBP) in the context of dentistry. Four EBP dimensions are measured by this instrument: 1) understanding of EBP concepts, 2) attitudes about EBP, 3) evidence-accessing methods, and 4) confidence in critical appraisal. The instrument, the Knowledge, Attitudes, Access, and Confidence Evaluation (KACE), has four scales, with a total of thirty-five items: EBP knowledge (ten items), EBP attitudes (ten), accessing evidence (nine), and confidence (six). Four elements of validity were assessed: consistency of items within the KACE scales (the extent to which items within a scale measure the same dimension), discrimination (the capacity to detect differences between individuals with different training or experience), responsiveness (the capacity to detect the effects of education on trainees), and test-retest reliability. Internal consistency of scales was assessed by analyzing responses of second-year dental students, dental residents, and dental faculty members using Cronbach's coefficient alpha, a statistical measure of reliability. Discriminative validity was assessed by comparing KACE scores for the three groups. Responsiveness was assessed by comparing pre- and post-training responses for dental students and residents. To measure test-retest reliability, the full KACE was completed twice by a class of freshman dental students seventeen days apart, and the knowledge scale was completed twice by sixteen faculty members fourteen days apart. Item-to-scale consistency ranged from 0.21 to 0.78 for knowledge, 0.57 to 0.83 for attitude, 0.70 to 0.84 for accessing evidence, and 0.87 to 0.94 for confidence. For discrimination, ANOVA and post hoc testing by the Tukey-Kramer method revealed significant score differences among students, residents, and faculty members consistent with education and experience levels.
For responsiveness to training, dental students and residents demonstrated statistically significant changes, in desired directions, from pre- to post-test. For the student test-retest, Pearson correlations for KACE scales were as follows: knowledge 0.66, attitudes 0.66, accessing evidence 0.74, and confidence 0.76. For the knowledge scale test-retest by faculty members, the Pearson correlation was 0.79. The construct validity of the KACE is equivalent to that of instruments that assess similar EBP dimensions in medicine. Item consistency for the knowledge scale was more variable than for other KACE scales, a finding also reported for medically oriented EBP instruments. We conclude that the KACE has good discriminative validity, responsiveness to training effects, and test-retest reliability.

Full-text available from: William D Hendricson, Jun 09, 2015
    ABSTRACT: Evidence-based practice (EBP) is an essential component of good quality, patient-centered health care. This requires practitioners to acquire EBP skills and knowledge during undergraduate and continuing education. Evidence-based practice education exists in a range of health care disciplines, including optometry. Evidence-based practice education, however, depends on relevant skills and knowledge in educators. Courses and workshops exist for the development of EBP teaching skills in some areas of health care but not in optometry. Here, we describe a pilot workshop designed to enhance the teaching of EBP and to investigate the perspectives of optometric educators on EBP, including their attitudes and perceived barriers to EBP and its teaching. Twenty-seven optometric educators, including 8 facilitators, participated. Of these, 14 were academics (including the 8 facilitators) and 13 were practitioners. Evidence-based practice attitudes were assessed using the Evidence-Based Practice Attitude Scale-50, with appropriate modifications for optometry. Workshop design incorporated strategies to trigger discussion among participants. A nominal group technique was used to identify, prioritize, and reach consensus on barriers to EBP. Although some participants expressed reservations about EBP, a common understanding of the contemporary definition of EBP emerged among educators. Thirty-five barriers to EBP were identified; "time" was selected in the top five barriers by most participants and attracted the highest total score, well above any other barrier (negative attitude to EBP, volume of evidence, integration with clinical practice, and lack of a lifelong learning mind-set). Attitudes toward EBP were generally positive and negatively correlated with both age and time since graduation.
A group of optometrists and academics new to implementing education in EBP displayed positive attitudes to EBP but considered that its application and teaching could be significantly hindered by a lack of time to access and appraise the large volume of available research evidence in the field of eye care.
    Optometry and Vision Science: Official Publication of the American Academy of Optometry 03/2015; 92(4). DOI:10.1097/OPX.0000000000000550 · 2.04 Impact Factor
    ABSTRACT: It is generally accepted that the more experience a physician or dentist possesses, the better the quality of health care delivered. However, recent studies have shown that there is in fact an inverse relationship between the number of years of practice and the quality of care provided. Evidence-Based Dentistry (EBD) is a process that restructures the way in which we think about clinical problems. It is an approach to clinical problem solving that has evolved from a self-directed, problem-based approach to learning rather than the more traditional didactic form. The American Dental Association's definition is by far the most comprehensive, as it captures the core elements of EBD and is, notably, a patient-centered definition. It defines EBD as an approach to oral health care that requires the judicious integration of systematic assessments of clinically relevant scientific evidence, relating to the patient's oral and medical condition and history, with the dentist's clinical expertise and the patient's treatment needs and preferences. This paper outlines this role, together with the advantages and problems of introducing an evidence-based approach to dentistry.
    ABSTRACT: Teaching the steps of evidence-based practice (EBP) has become standard curriculum for health professions at both student and professional levels. Determining the best methods for evaluating EBP learning is hampered by a dearth of valid and practical assessment tools and by the absence of guidelines for classifying the purpose of those that exist. Conceived and developed by delegates of the Fifth International Conference of Evidence-Based Health Care Teachers and Developers, the aim of this statement is to provide guidance for purposeful classification and development of tools to assess EBP learning. This paper identifies key principles for designing EBP learning assessment tools, recommends a common taxonomy for new and existing tools, and presents the Classification Rubric for EBP Assessment Tools in Education (CREATE) framework for classifying such tools. Recommendations are provided for developers of EBP learning assessments and priorities are suggested for the types of assessments that are needed. Examples place existing EBP assessments into the CREATE framework to demonstrate how a common taxonomy might facilitate purposeful development and use of EBP learning assessment tools. The widespread adoption of EBP into professional education requires valid and reliable measures of learning. Limited tools exist with established psychometrics. This international consensus statement strives to provide direction for developers of new EBP learning assessment tools and a framework for classifying the purposes of such tools.
    BMC Medical Education 10/2011; 11:78. DOI:10.1186/1472-6920-11-78 · 1.41 Impact Factor