Instruments for Evaluating Education in Evidence-Based Practice: A Systematic Review

Department of Medicine, University of Alabama School of Medicine, and Department of Veterans Affairs Medical Center, Birmingham, AL 35233, USA.
JAMA: The Journal of the American Medical Association (Impact Factor: 30.39). 10/2006; 296(9):1116-27. DOI: 10.1001/jama.296.9.1116
Source: PubMed

ABSTRACT
Context: Evidence-based practice (EBP) is the integration of the best research evidence with patients' values and clinical circumstances in clinical decision making. The teaching of EBP should itself be evaluated and guided by evidence of its effectiveness.
Objective: To appraise, summarize, and describe currently available EBP teaching evaluation instruments.
Data Sources and Study Selection: We searched the MEDLINE, EMBASE, CINAHL, HAPI, and ERIC databases; reference lists of retrieved articles; EBP Internet sites; and 8 education journals from 1980 through April 2006. For inclusion, studies had to report an instrument evaluating EBP, contain sufficient description to permit analysis, and present quantitative results of administering the instrument.
Data Extraction: Two raters independently abstracted information on the development, format, learner levels, evaluation domains, feasibility, reliability, and validity of the EBP evaluation instruments from each article. We defined 3 levels of instruments based on the type, extent, methods, and results of psychometric testing and on suitability for different evaluation purposes.
Data Synthesis: Of 347 articles identified, 115 were included, representing 104 unique instruments. The instruments were most commonly administered to medical students and postgraduate trainees and evaluated EBP skills. Among EBP skills, acquiring evidence and appraising evidence were most commonly evaluated, but newer instruments evaluated asking answerable questions and applying evidence to individual patients. Most behavior instruments measured the performance of EBP steps in practice, but newer instruments documented the performance of evidence-based clinical maneuvers or patient-level outcomes. At least 1 type of validity evidence was demonstrated for 53% of instruments, but 3 or more types of validity evidence were established for only 10%. High-quality instruments were identified for evaluating the EBP competence of individual trainees, determining the effectiveness of EBP curricula, and assessing EBP behaviors with objective outcome measures.
Conclusions: Instruments with reasonable validity are available for evaluating some domains of EBP and may be targeted to different evaluation needs. Further development and testing are required to evaluate EBP attitudes, behaviors, and more recently articulated EBP skills.

Available from: Terrence Shaneyfelt, Jul 02, 2015
ABSTRACT: Background: Despite the widespread teaching of evidence-based medicine (EBM) to medical students, the relevant literature has not been appropriately synthesized as to its value and effectiveness. Aim: To systematically review the literature regarding the impact of teaching EBM to medical students on their EBM knowledge, attitudes, skills, and behaviors. Methods: MEDLINE, SCOPUS, Web of Science, ERIC, CINAHL, and Current Controlled Trials were searched up to May 2011; backward and forward reference checking of included and relevant studies was also carried out. Two investigators independently extracted data and assessed the quality of the studies. Results: 10,111 potential studies were initially found, of which 27 were included in the review. Six studies examined the effect of clinically integrated methods, of which five were of low quality and the sixth used no validated assessment tool. Twelve studies evaluated the effects of seminars, workshops, and short courses, of which 11 were of low quality and the twelfth lacked a validated assessment tool. Six studies examined e-learning, of which the five with high or acceptable quality reported e-learning to be as effective as traditional teaching in improving knowledge, attitudes, and skills. One robust study found problem-based learning less effective than usual teaching. Two studies of high or moderate quality linked multicomponent interventions to improved knowledge and attitudes. No included study assessed the long-term effects of the teaching of EBM. Conclusions: Our findings indicated that some EBM teaching strategies have the potential to improve knowledge, attitudes, and skills in undergraduate medical students, but the evidence base does not demonstrate the superiority of any one method. There is no evidence demonstrating transfer to clinical practice.
    Medical Teacher 11/2014; 37(1). DOI:10.3109/0142159X.2014.971724 · 2.05 Impact Factor
ABSTRACT: Background: There is a lack of appropriate tools for assessing the effectiveness of teaching evidence-based practice in nursing. Objectives: The objective of the study was to develop an instrument evaluating students' perception of the effectiveness of EBP courses and to verify its psychometric properties. Design: A descriptive cross-sectional study design was used to verify the psychometric properties of the questionnaire measuring students' perception of the effectiveness of EBP courses. Participants: The psychometric properties were evaluated in a group of 129 graduate nursing students who had completed EBP courses. Methods: The instrument for measuring students' perception of the effectiveness of EBP courses was inspired by Kirkpatrick's evaluation model, which advocates evaluating interventions at four levels: reaction (satisfaction), learning, behavior change (transfer), and results (benefits). A web-based survey was used for data collection. Data were collected from mid-January 2013 through the end of March 2013. Results: A thirteen-item instrument was developed for measuring students' perception of the effectiveness of EBP courses. The internal consistency of the scale, based on standardized Cronbach's alpha, was .93, which signifies strong internal consistency. Factor analysis identified three factors of the instrument. The highest-rated items on a scale of 1 (strongly disagree) to 7 (strongly agree) were 'Implementation of EBP can improve clinical care' (mean 6.16), 'EBP instructors had a thorough knowledge of EBP' (6.13), 'EBP instructors were enthusiastic about teaching EBP' (5.65), and 'I can use my EBP knowledge and skills in my practice' (5.58). Graduate nursing students perceived themselves as most competent at the following EBP skills: asking questions regarding patients' care (mean 6.16), selecting the best evidence from what is found in a search (5.52), and searching efficiently for evidence that answers clinical questions (5.50). Conclusions: Testing of the psychometric properties of the questionnaire showed at least satisfactory validity and reliability. The majority of students perceived the EBP courses as effective. The instrument may be used to assess students' perception of the effectiveness of EBP courses.
    Nurse Education Today 10/2014; 35(1). DOI:10.1016/j.nedt.2014.09.010 · 1.46 Impact Factor