M. P. Sandila ( Basic Health Sciences, Ziauddin Medical University, Karachi. )
A. Ahad ( Basic Health Sciences, Ziauddin Medical University, Karachi. )
Z. K. Khani ( Basic Health Sciences, Ziauddin Medical University, Karachi. )
Objective: To develop a competency based discriminatory assessment method for physiology practical examination
Method: Results from the 1st Professional M.B.B.S. Part I and II examinations of three batches were taken, and students' performance in the traditional and the objective structured practical examination (OSPE) was compared. The course objectives for the practical examination of all three batches were the same. However, Batch II appeared in the conventional examination, while Batches III and IV were examined by OSPE.
Results: The mean score of Batch II was 68 ± 6, of Batch III 53 ± 13, and of Batch IV 50 ± 16. Batch II thus had an overall higher score than Batches III and IV. Comparison of mean scores using ANOVA showed a significant (P < 0.001) difference between the score of Batch II and those of Batches III and IV. Tukey's pairwise comparison of the batches showed a significant difference between Batches II and III (95% CI for difference: 9.1, 20.5; P < 0.001) and Batches II and IV (95% CI for difference: 12.2, 23.6; P < 0.001). However, no significant difference was found between Batches III and IV (95% CI for difference: -2.6, 8.8; P = 0.27). The results also showed that Batch II, examined by the conventional method, had a smaller spread around the mean (scores ranging from 52 to 81) than Batch III (25 to 80) and Batch IV (14 to 90).
Conclusion: OSPE is an effective tool to discriminate between good and poor performers in physiology practical examinations (JPMA 51:207; 2001).
Ziauddin Medical University (ZMU) is now in its fifth year of existence. Initially, for the first two years, we followed the traditional assessment method in vogue at different medical schools in Pakistan for physiology practical examinations. This system entails asking the candidate to select a sealed folder and perform the two experiments listed in it; it thus starts with a chance factor. It also incorporates subjectivity, as the student is evaluated two hours later on the basis of a few orally asked questions, some of which may not even be relevant to the experiments he has performed.
The chance factors and subjectivity associated with this form of examination led to the need for a more reliable method of assessment at ZMU. After long deliberation, the faculty decided to replace it with the Objective Structured Practical Examination (OSPE). This is a version of the Objective Structured Clinical Examination (OSCE), which has been in use in clinical teaching since 1971 [1] and has been found to be a reliable and valid assessment tool to test the competency of students in clinical practice [2]. The OSPE is a specified set of tasks that every candidate is expected to perform in the presence of examiners [3].
As students learn a variety of skills during practicals and also interpret the data obtained, a list of observable, performance-related and interpretive exercises in physiology was prepared. After faculty debate, a checklist was prepared for each observable, performance-related exercise that students perform in the laboratory. A prototype of the checklist used for examination of the 1st cranial nerve (olfactory) is given in Table 1.
Each item of a checklist starts with an action verb, indicating that each item is an observable behaviour. The checklists are dichotomous, with each performance element marked as either performed satisfactorily or unsatisfactorily. Some critical points were then identified in each checklist and assigned penalty points. If a student failed to perform a critical point, or performed it unsatisfactorily, a penalty point was deducted from his total score.
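The scoring rule described above can be sketched as a short routine. This is only an illustration: the item names, the choice of one penalty point per missed critical step, and the floor at zero are assumptions for the sketch, not details taken from the actual ZMU checklists.

```python
# Sketch of dichotomous checklist scoring with penalty points for
# critical steps (item names and weights are illustrative only).

def score_checklist(items, responses, penalty=1):
    """items: list of (name, is_critical); responses: name -> performed satisfactorily?"""
    score = 0
    for name, is_critical in items:
        if responses.get(name, False):
            score += 1                 # one mark per satisfactory element
        elif is_critical:
            score -= penalty           # penalty for a missed critical point
    return max(score, 0)               # assumed floor at zero

# Hypothetical items for the olfactory-nerve station
olfactory_items = [
    ("explains procedure to subject", False),
    ("asks subject to close eyes", False),
    ("occludes one nostril", True),
    ("tests each nostril separately", True),
    ("uses a non-irritant odour", True),
]

responses = {
    "explains procedure to subject": True,
    "asks subject to close eyes": True,
    "occludes one nostril": True,
    "tests each nostril separately": False,  # critical step missed
    "uses a non-irritant odour": True,
}

print(score_checklist(olfactory_items, responses))  # 4 satisfactory - 1 penalty = 3
```

Because every examiner applies the same items and the same penalty rule, two examiners observing the same performance should arrive at the same score, which is the point of the structured format.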
For the interpretive type of examination, the faculty again prepared, from the course objectives for physiology, a list of objectives that could be tested by interpretation. The cognitive domain was tested by providing students with a scenario, data, diagram or slide and asking them to answer multiple true/false or matching types of questions. A prototype of an interpretive station used in one of the examinations is given in Figure 1.
Statistical analysis was done using Minitab version 4. Mean scores were compared using ANOVA and Tukey's pairwise comparison test.
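The analysis can be reproduced in outline with open tools in place of Minitab. Since the raw scores are not published, the sketch below draws simulated samples matching the reported group sizes, means and standard deviations, then runs a one-way ANOVA and a Tukey-Kramer pairwise confidence interval; the simulated data are an assumption, so the exact numbers will differ from the paper's.

```python
import numpy as np
from scipy import stats

# Simulated scores matching the reported summaries (raw data not public):
# Batch II: n=54, 68 +/- 6 (traditional); Batch III: n=56, 53 +/- 13 (OSPE);
# Batch IV: n=54, 50 +/- 16 (OSPE).
rng = np.random.default_rng(0)
batch2 = rng.normal(68, 6, 54)
batch3 = rng.normal(53, 13, 56)
batch4 = rng.normal(50, 16, 54)

# One-way ANOVA across the three batches
f_stat, p_value = stats.f_oneway(batch2, batch3, batch4)

# Tukey-Kramer 95% CI for a pairwise difference of means
groups = [batch2, batch3, batch4]
df_error = sum(len(g) for g in groups) - len(groups)
mse = sum(((g - g.mean()) ** 2).sum() for g in groups) / df_error
q_crit = stats.studentized_range.ppf(0.95, len(groups), df_error)

def tukey_ci(a, b):
    """95% Tukey-Kramer interval for mean(a) - mean(b)."""
    se = np.sqrt(mse / 2 * (1 / len(a) + 1 / len(b)))
    diff = a.mean() - b.mean()
    return diff - q_crit * se, diff + q_crit * se

lo, hi = tukey_ci(batch2, batch3)  # Batch II vs Batch III
print(f"ANOVA P = {p_value:.4g}; Batch II - III 95% CI: ({lo:.1f}, {hi:.1f})")
```

With group means roughly 15 marks apart, the ANOVA P-value comes out far below 0.001 and the Batch II vs III interval excludes zero, mirroring the pattern reported in the Results.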
In academic year 1996 (Batch-II), 54 students appeared in the 1st Professional MBBS examination and were examined by the traditional method. In academic years 1997 (Batch-III) and 1998 (Batch-IV), 56 and 54 students appeared respectively who were tested by OSPE.
Table 2 shows the mean marks ± standard deviations of the three batches. The mean marks of Batch II students are significantly higher than those of Batches III and IV.
A box plot of scores obtained by the students of all three batches is shown in Figure 2.
The boxes representing the middle 50% of students of Batches III and IV have a much wider distribution than that of Batch II. The minimum marks obtained by students of Batch II are considerably higher than those of the other two batches.
Dismissal for incompetent performance during medical education is rare [4]. In-training evaluations, although widely used, are unable to differentiate clearly among different dimensions of competence [5] or to distinguish clearly and reliably between different levels of performance, particularly at or around a standard of acceptable performance [6]. The shortcomings of oral examinations and other highly prevalent assessment approaches have also been thoroughly documented [7]. It was in the light of this evidence that we started OSPE. After examining two batches with the traditional system, the pattern of examination was changed to OSPE, and the results showed that the mean score of the batch examined by the traditional method was significantly higher than those of the batches examined by OSPE. There was, however, no significant difference between the scores of batches examined by the same method. This suggests that OSPE splits students into a larger number of groups in terms of measured competence, that is, it discriminates between different levels of competence better than the traditional method of examination.
The test can incorporate a large number of questions and skills covering a wide variety of physiological phenomena, so students are tested over a much wider sphere of cognitive and competency skills than with the traditional method of examination. It can also be designed to incorporate questions and skills according to a predetermined weightage for testing different faculties.
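Assembling stations to a predetermined weightage can be sketched as a simple blueprint-driven selection. The topic names, weights, and station bank below are invented for illustration; the paper does not publish its blueprint.

```python
import random

# Hypothetical blueprint: share of total stations per topic (weights sum to 1)
blueprint = {"cranial nerves": 0.3, "spirometry": 0.3, "haematology": 0.4}

# Hypothetical bank of prepared stations per topic
bank = {
    "cranial nerves": ["olfactory", "optic", "trigeminal"],
    "spirometry": ["FVC", "FEV1", "peak flow"],
    "haematology": ["Hb estimation", "TLC", "blood grouping", "bleeding time"],
}

def build_exam(n_stations, seed=0):
    """Draw stations so each topic receives its blueprint share."""
    rng = random.Random(seed)
    exam = []
    for topic, weight in blueprint.items():
        k = round(weight * n_stations)        # stations allotted to this topic
        exam += rng.sample(bank[topic], k)    # sample without replacement
    return exam

exam = build_exam(10)
print(exam)  # 3 cranial-nerve, 3 spirometry, 4 haematology stations
```

Fixing the topic shares in advance is what makes the paper's claim possible: every candidate faces the same distribution of content, so differences in scores reflect competence rather than the luck of the draw.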
In the traditional method a number of chance factors are incorporated, whereas in OSPE as the whole examination is structured, each student has an equal chance.
OSPE has been highly appreciated by examiners, both local and foreign (invited from the UK), and appears to be a reliable measurement tool to discriminate between good and poor performers in physiology practical examinations.
1. Harden R, Stevenson M, Downie W, et al. Assessment of clinical competence using an objective structured examination. Br. Med. J., 1975;1:447-51.
2. Hodges B, Regehr G, Hanson M, et al. Validation of an objective structured clinical examination in psychiatry. Acad. Med., 1998;73:910-12.
3. Nayar U, Malik SL, Bijlani RL. Objective structured practical examination (OSPE): a new concept in assessment of laboratory exercises in pre-clinical sciences. Med. Educ., 1986;20:204-9.
4. Crowley AE, Etzel SI, Petersen ES. Undergraduate medical education. JAMA, 1984;252:1524-32.
5. Haber RJ, Avins AL. Do ratings on the American Board of Internal Medicine Resident Evaluation Form detect differences in clinical competence? J. Gen. Intern. Med., 1994;9:140-45.
6. Gray J. Global rating scales in residency education. Acad. Med., 1996;71(Suppl):S55-S63.
7. Catton P, Tallett S, Rothman A. A guide to resident assessment for program directors. Ann. R. Coll. Phys. Surg. Can., 1997;30:403-9.