Article

Objective structured clinical/practical examination (OSCE/OSPE).

Dept of Surgery, Jawaharlal Institute of Postgraduate Medical Education and Research, Pondicherry, India.
Journal of Postgraduate Medicine (Impact Factor: 1.26). 39(2):82-4.
Source: PubMed
    ABSTRACT: Introduction: In medicine, dentistry and pharmaceutical sciences programmes, teaching combines didactic lectures, practical demonstrations and experiments performed by students. The purpose of this study was to assess the attitudes of nursing students towards practical demonstration in physiology. Material and Methods: Seventy-three first-year nursing students underwent practical demonstrations of physiology experiments. Students indicated their agreement or disagreement with the 8 items by ticking one of five alternative responses. Mean attitude scores were calculated for each item and for the total scale. Results: The overall mean attitude score of 3.76 was on the favourable side. Eighty-seven percent of students agreed that practical demonstration reinforces concepts, and eighty-nine percent found that practical demonstration is a good form of learning experience. Conclusion: Introducing practical demonstrations in addition to didactic lectures may help students understand concepts in physiology.
    Journal of clinical and diagnostic research : JCDR. 09/2013; 7(9):1989-91.
    ABSTRACT: Background. The Objective Structured Practical Examination (OSPE) is widely recognised as one of the more objective methods of assessing practical skills in healthcare programmes, including undergraduate physiotherapy curricula. Objectives. To obtain feedback from both students and staff involved in the introduction of an OSPE in 2011, in order to refine and standardise the format throughout the curriculum. Methods. A qualitative research design was used. Data were gathered through a questionnaire with semi-structured open-ended items and a focus group discussion. Participants were all third-year undergraduate physiotherapy students (N=47) and all staff members (N=10) in the 2011 academic year who were exposed to the OSPE format or were involved in the first OSPE. Results. The main concerns raised by both students and staff were: (i) pressure due to time constraints and how this might affect student performance; and (ii) the question of objectivity during the assessment. However, these initial concerns eased as participants came to experience the OSPE more positively, owing to the structure and objectivity of the process of implementing it. Conclusion. While both students and staff reported positive experiences, the challenges that emerged provided valuable insight for refining the OSPE format in this undergraduate physiotherapy department. AJHPE 2013;5(2):72-74. DOI:10.7196/AJHPE.218

    From the article's introduction: Assessment of clinical competence is an essential component of health professions education, requiring educators to make informed decisions that measure students' clinical knowledge and skills accurately. Such clinical assessments have often been challenged by a lack of objectivity. The Objective Structured Clinical Examination (OSCE) was originally developed in Dundee in the mid-1970s [1] with the aim of assessing clinical competence in an objective, structured way. Bartfay et al. [2] and Major [3] highlighted that the OSCE format introduces standardisation aimed at improving objectivity in assessment. In the OSCE method, clinical competencies are assessed as students move through a number of 'stations', where they are individually graded against precise criteria in the form of a checklist. The term Objective Structured Practical Examination (OSPE) was derived from the OSCE in 1975, when the format was modified to include practical examination. [4] Like the OSCE, the OSPE tests students' ability to perform a practical skill rather than merely what they know; however, while the OSCE focuses on assessing clinical competence, the OSPE is designed to assess competence in performing a practical skill outside the clinical context. The OSPE has several distinct advantages over other forms of practical assessment: it can be used as a summative assessment of individuals' performance in the practical skills component of a module, as well as for formative evaluation, in which the student receives feedback as part of the learning process. In addition to its role in assessment, an OSPE focuses on the individual competencies being tested, and the examination covers a broader range of practical skills than a 'traditional' examination. [5] The traditional examination in this department was an unstructured evaluation of different techniques, and was neither valid nor reliable, since every student was seen by a different examiner and given a different assessment task. In the OSPE, an individual's ability to perform a technique is tested more objectively because all candidates are exposed to the same predetermined set of techniques and questions, which minimises the subjectivity of the assessment. [6] Mastering practical skills is an important aspect of a course like physiotherapy, [7] which means that its assessment component will influence students' learning strategies. [8] However, if an assessment task is to achieve the desired outcome, it must employ instruments that yield valid, accurate data that are consistent and reliable. In addition, inter-rater variability among examiners can be large, shaped by differences of opinion based on individual examiners' subjective perceptions. [9] This lack of objectivity among examiners assessing practical skills was a problem area identified in this undergraduate physiotherapy department in the Western Cape, South Africa, and a departmental decision was made to pilot the OSPE. The aim of this study was to determine the perceptions and experiences of students and staff following the introduction of the OSPE format in the department. Since the OSPE was a new format for assessing practical competence, developed specifically to enhance objectivity, students and staff were approached and asked to describe their experiences and perceptions of the process following its initial implementation. The importance of both students' and staff's attitudes towards and perceptions of the training programme in undergraduate health professions education was acknowledged.
    African Journal of Health Professions Education. 11/2013; 5(2):72-74.
    ABSTRACT: Assessment of practical skills in medical education needs to move from subjective methods towards objective ones. The Objective Structured Practical Examination (OSPE) has been considered one such method. This study evaluated the feasibility of using the OSPE as a tool for the formative assessment of undergraduate medical education in pharmacology. Second-year MBBS students, at the end of the first term, were assessed by both the conventional practical examination (CPE) and the OSPE. A five-station OSPE was conducted one week after the conventional examination. The scores obtained in the two examinations were compared, and a Bland-Altman plot was also used to compare the two methods. Students' perceptions of the new method were obtained using a questionnaire. There was no significant difference in mean scores between the two methods (P = 0.44, unpaired t test). The Bland-Altman plot comparing the CPE with the OSPE showed that 96% of the differences in scores between the two methods were within the acceptable limit of 1.96 SD. Regarding students' perceptions of the OSPE compared with the CPE, 73% responded that the OSPE could partially or completely replace the CPE, and 66.4% judged the OSPE to be an objective and unbiased test compared with the CPE. Use of the OSPE is feasible for formative assessment in the undergraduate pharmacology curriculum.
    Journal of education and health promotion. 01/2013; 2:53.