Introduction
Competency-based medical education (CBME) involves the attainment of observable abilities by students in a time-independent, learner-centered manner.1 Medical education has undergone a shift, with core instructional goals now being explicitly defined. It was felt that conventional methods of teaching did not satisfy all the elements of assessment, so the aim was to train graduates to efficiently meet the health needs of society.2
A new examination pattern, the Objective Structured Practical Examination (OSPE), was developed in 1975 and later modified by Harden and Gleeson in 1979. In an OSPE, each component of competence is tested uniformly and objectively for all students taking a practical examination at a given sitting.3, 4
Although OSPE gives a reasonable idea of each student's level of achievement in every practical skill related to a particular discipline, it is still rarely used. It can be applied to both formative and summative evaluation. The conventional practical examination has several problems: practical skills are not directly observed, and questioning is deferred to the end of the session. Hence the need for a more evolved assessment design, in which educators take responsibility for developing testing methods or procedures that fairly evaluate students' achievements and yield accurate results.5
In our medical college, the evaluation of practical skills is done by conventional techniques. This study was therefore intended to assess OSPE as a tool for the evaluation of practical skills in undergraduate medical students.
Materials and Methods
Study design & sampling method used are prospective interventional study, Cluster sampling. The study included 150 Phase 1 MBBS students of 2019-20 batch & faculty of Biochemistry. Post Institutional Ethical Clearance, consent was taken from the faculty and students. The faculty was sensitized to OSPE. CPE was conducted on topic “Estimation of Blood Glucose” and marks were awarded. Students were sensitized to OSPE and exam was conducted on same topic with 2 procedure & 4 response stations. Percentage graphs & Student’s’ test using MS Excel were used for statistical analysis.
Results
A significant difference between CPE and OSPE scores was noted, with OSPE scores higher than CPE scores. Students found the questions understandable and easy to score, but some felt that the time allotted was insufficient. Faculty perceived OSPE as an objective, unbiased method of assessment that covered all domains.
Discussion
Assessment is a method of determining the adequacy and impact of learning activities in relation to their objectives. In the conventional assessment technique, most students are judged on the cognitive domain and not on the psychomotor and affective domains.4, 6
OSPE was derived from the Objective Structured Clinical Examination (OSCE) by Harden,5 who noted that the assessment of practical skills is often neglected because of unsatisfactory assessment tools. OSPE assesses process and product separately, through observation of performance and assessment of the end result. In overcoming the drawbacks of the conventional technique, OSPE has proved advantageous in assessing the psychomotor skills of students, whereas conventional methods have been criticized for lacking structure and standardization.
OSPE as a formative assessment helps achieve the above objectives and tests the competencies laid down in CBME. Each component of competence is assessed in turn as the objective of one of the stations in the examination: at the procedure stations the student's psychomotor skills are assessed, while at the response stations the cognitive skills are tested, with the examiner recording performance on a checklist. Since this was the first time OSPE was conducted in the Department of Biochemistry at Deccan College of Medical Sciences, we sensitized the faculty and trained the students before the OSPE, and followed it with a questionnaire to elicit the perceptions of students and faculty.
This form of assessment eliminates carelessness in the conduct of experiments, thereby improving the validity of the test. Students take more interest because of the variety of stations and remain alert throughout the assessment, which is not the case with the traditional format.7
In our study, we assessed the perceptions of Phase 1 MBBS students in Biochemistry on their experience with OSPE and also evaluated the perceptions of faculty regarding OSPE using a questionnaire. There was a statistically significant difference in the mean scores between the CPE and OSPE (P < 0.01). Studies by Vijaya and Alan,8 KL Bairy et al.,9 and Mokkapati et al.10 found OSPE to be a well-organised, easier and less stressful examination that covered and tested the appropriate knowledge better than the conventional examination.
In a similar study by Faldessai et al., 90% of the student participants felt that OSPE was a better examination pattern than the conventional examination and that it was better structured and more uniform.11 In our study, students reported that OSPE assessed the relevant practical skills and covered the appropriate knowledge consistent with the learning objectives. OSPE was well accepted and much appreciated by the students, as they found it easy to score and less stressful than CPE.
Conclusion
In conclusion, OSPE is feasible and has good reliability and validity for evaluating the practical skills of undergraduate medical students, as perceived by both examiners and students, and it has many distinct advantages over CPE. Under the conventional practical examination, students appeared to lack competency even when they possessed the requisite skills and knowledge, suggesting that these still need to be strengthened further to make them fully competent. Our study supports the introduction of OSPE in medical education for the evaluation of the practical skills of undergraduate medical students. Considering its advantages, the faculty too favoured adopting OSPE as the standard for performance-based assessment.