Objective: To evaluate medical students’ views about undertaking the structured long interview and clinical examination as a formative assessment.
Method: The qualitative, exploratory study was conducted from February to July 2019 at the Islamic International Medical College, Riphah International University, Islamabad, Pakistan, and comprised final-year medical students who had undertaken formative assessment through the structured long interview and clinical examination during their clerkship rotations in Paediatrics, General Medicine, General Surgery, and Gynaecology and Obstetrics. Four focus group discussions (FGDs) were conducted, one for each clerkship module. Each recorded FGD was transcribed verbatim, and thematic analysis was conducted manually.
Results: Of the 32 students, there were 8(25%) in each of the four groups. Five major themes and five sub-themes emerged, with the main themes being: Purpose, Learning, Timing, Relevancy and Fairness of the structured long interview and clinical examination.
Conclusion: The students generally thought that the structured long interview and clinical examination was effective in enhancing their clinical skills learning and should be conducted more frequently with minor adjustments.
Keywords: Medical students, Opinions, Professional competence, Assessment, Views. (JPMA 71: 2000; 2021)
Introduction
The assessment of clinical competencies in medical education has made considerable progress over the last half century.1 It has evolved as an integral component of learning as medical education has witnessed a paradigm shift from ‘assessment of learning’ to ‘assessment for learning’.2 Two types of assessment have been described for medical education: formative and summative.3 Formative assessments serve to reinforce students’ intrinsic motivation to learn, inspiring them to set higher standards for themselves. Summative assessment, on the other hand, makes an overall judgment of students’ competencies without providing them feedback,4 and is designed to provide professional self-regulation and accountability.3 A variety of formative tools are used to assess the clinical competencies of a medical student, including the Objective Structured Long Examination Record (OSLER), Case-based Discussion (CbD), Objective Structured Clinical Examination (OSCE), Mini Clinical Evaluation Exercise (Mini-CEX), and Direct Observation of Procedural Skills (DOPS).5-12 The traditional long case has been the subject of considerable criticism owing to its lack of structure, objectivity, reliability and validity. Examining a student on a single clinical case does not guarantee generalisability across other clinical cases.13 Moreover, the long-case viva is taken without a preset structure and is left completely open to examiner bias.14 Several assessment methods have been proposed in the literature to modify the traditional long case examination. One assessment tool used in the clinical settings of the Riphah International University (RIU), Islamabad, Pakistan, is the structured long interview and clinical examination (SLICE), which is believed to have produced encouraging results since its introduction.
The current study was planned to indirectly identify the shortcomings of SLICE through documented feedback from students for improving and standardising the assessment tool for use at national and international levels.
Subjects and Methods
The qualitative, exploratory study was conducted from February to July 2019 at the Islamic International Medical College (IIMC), RIU, Islamabad, Pakistan. After approval from the institutional ethics review committee, the sample was drawn using a purposive sampling technique from among final-year medical students who had undertaken formative assessment through SLICE during their clerkship rotation in the departments of Paediatrics (Paeds), General Medicine (GM), General Surgery (GS), and Obstetrics and Gynaecology (OB-GYN). Students who had repeated any year of medical college due to poor academic performance and those who had not completed their rotation before the commencement of the study were excluded.
The SLICE was developed by the medical education faculty at IIMC to assess clinical competency of undergraduate students in a short duration of 15 minutes, with an easy marking system compared to other tools that require 20-30 minutes to complete the assessment. SLICE comprises two components and five sections (Annexure).
Although history-taking is unobserved, the examination part is directly observed by the examiner. This deficit is overcome by having the examinee ask pertinent questions of the patient under observation and counsel the patient regarding the diagnosis and follow-up. A reliability check for SLICE yielded a value of 0.87, comparable to the reported reliability of the OSCE and OSLER, two of the most widely used tools for assessment in medical education. The content validity of SLICE was checked by calculating the content validity index (CVI), which was 0.90 for relevance and 0.92 for clarity,15,16 indicating SLICE to be a reliable and valid tool.
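For readers unfamiliar with how a content validity index is typically derived, the following is a minimal sketch of the standard item-level CVI calculation (proportion of expert panellists rating an item 3 or 4 on a 4-point relevance scale) and the scale-level average. The panel ratings below are hypothetical illustrations, not the study's actual validation data.

```python
# Minimal sketch of a content validity index (CVI) calculation.
# I-CVI = proportion of experts rating an item 3 or 4 on a 4-point scale;
# S-CVI/Ave = mean of the item-level CVIs across the instrument.

def item_cvi(ratings):
    """Item-level CVI: share of experts rating the item relevant (>= 3 of 4)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI (averaging approach): mean of the item-level CVIs."""
    return sum(item_cvi(r) for r in items) / len(items)

# Hypothetical panel: five experts rate four items for relevance (1-4 scale)
relevance_ratings = [
    [4, 4, 3, 4, 4],  # item 1: all five rate it relevant -> I-CVI = 1.0
    [3, 4, 4, 2, 4],  # item 2: four of five -> I-CVI = 0.8
    [4, 3, 4, 4, 3],  # item 3: all five -> I-CVI = 1.0
    [4, 4, 4, 3, 4],  # item 4: all five -> I-CVI = 1.0
]

print(round(scale_cvi(relevance_ratings), 2))  # -> 0.95
```

A CVI near 0.90, as reported for SLICE, indicates that nearly all expert ratings fell in the "relevant" range across items.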
After taking informed consent from the participants, qualitative data were collected using focus group discussions (FGDs), with each of the four sessions lasting 90 minutes. The basic questionnaire for the FGDs was aimed at recording students’ opinions and views regarding SLICE, and was finalised after validation by experts in the field of medical education.
If any point needed explanation during an FGD, the facilitator asked further questions to probe the topic. Each FGD was audio-recorded and subsequently transcribed verbatim. Thematic analysis was conducted manually in four stages. In the first stage, specialty-wise coding was done, producing four sets of codes. In the second stage, the codes were rearranged into themes and sub-themes, giving them a more robust and meaningful structure. In the third stage, themes from all specialties were combined into a single file, resulting in 11 themes and 32 sub-themes. These were re-evaluated in the fourth stage, where repeated themes and sub-themes were merged into thematic terms covering larger domains. To ensure adequate coding and researcher reliability, investigator triangulation was performed: after the initial thematic analysis, two investigators independently analysed the data, and all investigators reached a consensus on the emergent themes before the data were finalised.
Results
Of the 32 students, there were 8(25%) in each of the four groups. Data analysis produced a final set of five major themes and five sub-themes (Table).
The first theme was ‘purpose of SLICE’. The students found SLICE to be a purposeful formative assessment tool during clinical clerkship that helped them overcome their identified shortcomings in clinical competence and supported them in their preparation for their final examination. According to a student, it assessed the implementation of their theoretical knowledge in clinical practice. “Slice basically helps in learning how to interact with the patient. It is learning about how to build a diagnosis and present it to the supervisor. It helps to correlate our clinical knowledge and formulate a diagnosis.” (F03: GS) Another student found it useful in terms of time management. “The biggest advantage is that I can become aware of my weaknesses and as the assessment is time-bound, I can also learn time management.” (M02: P)
The second theme was ‘students’ learning’. Most students reported that their knowledge, attitudes and skills had been enhanced through patient interaction, examination and making a clinical diagnosis after close observation. According to them, exposure to rare syndromes and appropriate use of medical equipment strengthened their clinical skills. One of them said: “I can corelate my academic knowledge and clinical knowledge and I can clinically implement my knowledge or reflexes on the patients.” (M05: GM) Another stated: “I practically apply my theoretical knowledge (and) this improves our interaction with the patient.” (F03: GS)
Students found that SLICE helped indicate their level of competence in different areas, e.g. history-taking, physical examination and diagnosis-making. “I really like this assessment method as it makes me aware about my level of knowledge, skills and ability to use all this to interact, examine and diagnose the patient. At the end, we are skilled enough to handle the patient independently.” (F08: G&O)
Another student reported a confidence boost through patient interaction, an essential component of attitude enhancement. “I have been doing clinical practice in the 4th year. By doing SLICE, my confidence levels have surely increased now. It is easier to approach a patient and conduct an examination. It is easier to handle a patient as some give their history willingly while others give us a tough time and they need counselling. I think SLICE has brought betterment in that aspect too.” (M02: P)
The third theme was ‘timing of SLICE’. The students suggested that SLICE should be used during midterms, which may help them prepare for the final-term examination.
The fourth theme identified was ‘relevancy of SLICE’. The students believed that SLICE had been used to assess knowledge, practical skills and behavioural change related to current evidence-based practices in medicine.
The final theme was ‘fairness of SLICE’. According to the students, SLICE was found to be a fair assessment tool which effectively highlighted their weak areas during the clinical clerkship period.
Discussion
To the best of our knowledge, the current study is the first to record undergraduate students’ opinions and views regarding SLICE as a formative assessment tool.
The students identified that the main purpose of SLICE was to help them clinically apply the theoretical knowledge and concepts that they had gained from reading scientific literature in textbooks and journals, and from didactic lectures. Workplace-based assessments (WBAs) are specifically designed to enhance the clinical experience of students.17 SLICE involves an interactive session with a supervisor in which students present their findings. By using a critical, cross-questioning approach, the supervisor informs the students about their shortcomings and tells them about the latest medical advances related to the relevant clinical cases. This detailed observation and feedback process leads to an improved understanding of theoretical and clinical concepts and enhancement of patient-handling skills. Such detailed analysis has been found to be lacking in certain other WBAs, such as the Mini-CEX and the OSCE.18,19 An additional purpose and advantage that the students pointed out for SLICE was better preparation for the final high-stakes examination, as increased exposure to the examination environment serves as a confidence booster.19
As the main purpose of SLICE is to improve students’ knowledge, their overall competencies are polished. A detailed cross-questioning session with the supervisor provides the students with insight into their concepts, rectifying any incorrect mechanisms or correlations that they may have committed to memory. Compared with SLICE, learning is not a priority objective for a student during a high-stakes examination;20 students are more concerned with giving the right answer to perform well. Also, the student gets only one chance at a long case viva, limiting their exposure to a single clinical condition.13 In SLICE, the student gets multiple attempts, both personally and by observing peers. During OSLER, students also get exposure to multiple clinical scenarios, since it is used formatively as well.21
Furthermore, since the students are required to arrive at a provisional diagnosis without the help of laboratory investigations, they get the opportunity to critically analyse the history as well as signs and symptoms of the patient. This opportunity is not available during either the traditional long case or OSLER.14,21 Exposure to patients at the bedside in an examination environment also leads to improved communication skills. The role of communication skills in medical education has been repeatedly emphasised.22,23 Since the students are aware that they are being observed in SLICE, they knowingly practise their communication with patients as well. Similarly, their physical examination skills get polished.
Generally, the students in the current study opined that SLICE should be conducted as a mid-module assessment. One student, however, believed SLICE should be conducted at the end of the module once the modular teaching had finished.
The optimal method of conducting any workplace-based formative assessment is to have repeated sessions throughout the module until the student has achieved the required competency levels.24 Similarly, OSLER, when used formatively, is conducted both during and at the end of a clinical module.21 In contrast, the long case examination is only conducted at the end of modules as a high-stakes examination.25
An in-depth interaction with the patient under supervision, followed by critical feedback provides the student with the opportunity to improve the breadth as well as the depth of knowledge. Not only do students learn how to present their cases to the supervisor, but their physical examination and soft skills also improve. Furthermore, since the students go through SLICE multiple times and observe other students as well, they are exposed to a variety of clinical cases. In comparison, since the students are not supervised during the long case, students do not focus on improving either their physical examination or soft communication skills.23,25
Since SLICE is designed so that each component is marked separately, it yields a two-fold benefit. Firstly, any area where the student scores low is highlighted, clearly indicating that the student did not perform well in that specific component. Secondly, even if a student does not perform well in one component of the examination, they still have a good chance of passing, because each component is graded independently of the other parts of the examination.
The current study has limitations. SLICE is used as an examination tool at RIU and has not been validated internationally; its validity and reliability need to be established at national and international levels. The study recorded the opinions and views of RIU medical students only, and, due to time constraints, including all final-year students was not possible. As a qualitative analysis, the study has only explored the role of SLICE in students’ learning; other parameters, such as SLICE-related problems, were not considered.
Future studies should employ a mixed-method design. A quantitative trial should be conducted in which one group of students undergoes SLICE while another goes through an alternative examination, such as OSLER, to assess clinical competencies. Moreover, studies should also focus on recording the perceptions of medical students from more than one college.
The SLICE tool should be assessed further in terms of other parameters, such as establishing all types of validity and reliability and exploring SLICE-related problems, thereby providing suggestions for improving its future implementation in the curriculum. The impact of SLICE can also be examined for both formative and summative assessment purposes.
Conclusion
The students in the study opined that SLICE was an effective tool for correlating theoretical knowledge with clinical skills. Also, since their SLICE sessions were supervised, they received critical feedback on the spot, highlighting any shortcomings in their clinical performance or knowledge base.
Disclaimer: The text is based on an academic thesis, and was presented at the International Conference on Medical Education, 2019, in the oral category.
Conflict of interest: None.
Source of Funding: None.
References
1. Van Der Vleuten CP, Scherpbier AJ, Dolmans DH, Schuwirth LW, Verwijnen GM, Wolfhagen HA. Clerkship assessment assessed. Med Teach 2000; 22: 592-600.
2. Schuwirth LW, Van Der Vleuten CP. Changing education, changing assessment, changing research? Med Educ 2004; 38: 805-12.
3. Rauf A, Shamim MS, Aly SM, Chundrigar T, Alam SN. Formative assessment in undergraduate medical education: concept, implementation and hurdles. J Pak Med Assoc 2014; 64: 72-5.
4. Parakh K. Assessment in medical education. N Engl J Med 2007; 356: 2108.
5. Yanting SL, Sinnathamby A, Wang D, Heng MTM, Hao JLW, Lee SS, et al. Conceptualizing workplace based assessment in Singapore: Undergraduate Mini-Clinical Evaluation Exercise experiences of students and teachers. Ci Ji Yi Xue Za Zhi 2016; 28: 113-20.
6. Castanelli DJ, Jowsey T, Chen Y, Weller JM. Perceptions of purpose, value, and process of the mini-Clinical Evaluation Exercise in anesthesia training. Can J Anaesth 2016; 63: 1345-56.
7. Düzgün ŞA, Zeren S, Bayhan Z. Objectively structured verbal examination to assess surgical clerkship education: An evaluation of students’ perception. Turk J Surg 2018; 34: 9-12.
8. Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B. An evaluative study of objective structured clinical examination (OSCE): students and examiners perspectives. Adv Med Educ Pract 2019; 10: 387-97.
9. Ramachandran G, Ko KMA, Ghosh S. The “failure to fail” phenomenon in the clinical long case examination. Quest Int J Med Health Sci 2019; 2: 3-7.
10. Shah SI, Baig M, Shah S, Bashir EA, Sarwar H, Shah JA. Inter-rater reliability of Objective Structured Long Examination Record. J Ayub Med Coll Abbottabad 2018; 30: 180-3.
11. Skrzypek A, Szeliga M, Stalmach-Przygoda A, Górski S, Kowalska B, Kocurek A, et al. The Objective Structured Clinical Examination (OSCE) from the perspective of 3rd year’s medical students—a pilot study. Folia Med Cracov 2017; 17: 67-75.
12. Wanjari S, Vagha S. Utility of OSLER for assessing enhancement of learning in postgraduate students. South East Asian J Med Educ 2020; 13: 37.
13. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet 2001; 357: 945-9.
14. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007; 29: 855-71.
15. Waseem AG, Sharif S, Habib MS, Khan RA, Hameed U, Hameed A. Establishing Validity of SLICE–As an Assessment Instrument of Long Case. J Islam Int Med Coll 2019; 14: 43-7.
16. Coughlan J, Rolfe I. The effect of a structured question grid on the validity and perceived fairness of a medical long case assessment. Med Educ 2000; 34: 46-52.
17. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63-7.
18. Newble DI, Hoare J, Elmslie RG. The validity and reliability of a new examination of the clinical competence of medical students. Med Educ 1981; 15: 46-52.
19. Nasir AA, Yusuf AS, Abdur-Rahman LO, Babalola OM, Adeyeye AA, Popoola AA, et al. Medical students’ perception of objective structured clinical examination: a feedback for process improvement. J Surg Educ 2014; 71: 701-6.
20. Jyothirmayi R. Case-based discussion: assessment tool or teaching aid? Clin Oncol (R Coll Radiol) 2012; 24: 649-53.
21. Gleeson F. AMEE medical education guide No. 9. Assessment of clinical competence using the Objective Structured Long Examination Record (OSLER). Med Teach 1997; 19: 7-14.
22. McManus I, Vincent C, Thom S, Kidd J. Teaching communication skills to clinical students. BMJ 1993; 306: 1322-7.
23. Doherty E, McGee H, O'Boyle C, Shannon W, Bury G, Williams A. Communication skills training in undergraduate medicine: attitudes and attitude change. Ir Med J 1992; 85: 104-7.
24. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR; International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach 2010; 32: 676-82.
25. Ponnamperuma GG, Karunathilake IM, McAleer S, Davis MH. The long case and its modifications: a literature review. Med Educ 2009; 43: 936-41.