Objective: To assess the performance of students on clinical skill factors and to measure the satisfaction level of students related to the training.
Methods: The descriptive study was conducted at Rehman Medical College, Peshawar, Pakistan, from August 1 to September 15, 2013, and comprised all third-year medical students who had undergone clinical skills training. Their performance was evaluated through end-of-module objective structured clinical examination. Students' feedback on satisfaction was obtained on a validated, purpose-designed tool using a five-point Likert scale. The clinical skills centre training programme was monitored by the quality enhancement cell at the college. SPSS 16 was used for statistical analysis.
Results: Of the 98 students who took the examinations, 94(96%) cleared the generic stations, while pass rates on discipline-based stations ranged from 70(72%) to 96(98%). Overall, 94(96%) cleared the first objective structured clinical examination, ranging from 83(84.6%) for Persian language conversation training to 98(100%) for general physical examination. In the second examination, 90(92%) students passed, ranging from 72(73%) for Gynaecology to 97(98.7%) each for Surgery and Ear, Nose and Throat. There was no significant difference between the mean results of the two examinations (p>0.05).
Conclusion: Clinical skills training achieved the desired objectives and outcomes. However, continuing studies need to be done to establish reliability of the programme.
Keywords: Clinical skills, Clinical competence, Undergraduate medical education, Educational measurement, Manikins, Patient simulation. (JPMA 67: 73; 2017)
Undergraduate training in the MBBS programme in Pakistan, by and large, follows the traditional five-year system, in which students generally start clinical training through opportunity-based patient encounters immediately after the basic sciences course, without any simulation-based training. This may lead to suboptimal acquisition of patient-doctor communication skills by novice students. Today's medical students and graduate doctors have significant deficits in their clinical skills. Actual bedside teaching has declined from 75% in the 1960s to less than 20% today, owing to limited patient access and informed-consent requirements.1
To overcome this problem, clinical skills labs (CSLs) were developed globally and are considered an appropriate setting, providing medical students an intermediate medium of training in clinical skills prior to actual patient encounters.2-4
The first CSL was established at Limburg University, Maastricht, the Netherlands, in 1976.5 Currently, CSLs are established in several innovative medical schools in the United Kingdom (UK), including the Universities of Leeds, Dundee, Dublin, Southampton and Liverpool, and the Imperial College.6,7 In 1994, the Virginia School of Medicine (United States) started a clinical skills centre (CSC) based on simulated or standardised patients for instructional and assessment exercises.8
Liaquat National Hospital (LNH) was the first in Pakistan to develop a well-equipped CSL in 1996.9 The lab offers certified training courses in life-saving techniques, e.g. basic life support (BLS) and advanced airway management. Since then a few other undergraduate institutions claim to have a CSC/CSL of some note.
The Rehman Medical College (RMC) was established in 2010. A CSC was established in December 2012 with the aim of imparting competence in clinical skills through simulations and simulated patients to medical students prior to their real patient contact.
The CSC initially provided training in clinical skills to the first batch of third-year MBBS students during the 2012-13 session. The programme included communication skills, clinical examination on simulated patients and procedures on manikins. Special certification in cardiac first response (CFR), an accredited condensed course combining BLS and advanced cardiac life support (ACLS), was also conducted.
The current study was planned as a preliminary evaluation of the CSC training programme, based on assessment of students' competence at clinical skill stations (CSS) and measurement of students' satisfaction with the training.
Materials and Methods
This descriptive study was conducted at the RMC, Peshawar, from August 1 to September 15, 2013. Third-year medical students who had undergone a training programme in clinical skills at the CSC were included. Those who had failed to attend 75% of the CSC training sessions or had incomplete data were excluded. Evaluation of the training programme was based on comparison of student skills in two successive objective structured clinical examinations (OSCEs) and obtaining a measure of student satisfaction level with their training programme.
The RMC CSC conducted two daily sessions of 75 minutes each with two batches of students, who were further split into working groups of 2-3 students. Training in each session was skill-based, with an introductory orientation through PowerPoint presentation and video clips, followed by practical activities for skill acquisition on equipment/instruments, through procedures and physical examinations; manikins and/or simulations were used as needed. Keeping in view the local patient community, Persian language learning was incorporated as part of communication skills in each session. The training covered clinical skills in each of the six prescribed MBBS clinical disciplines, ranging from generic to specific history taking and physical examinations. OSCE was used as the tool for formative assessment at the end of each session. A summative written and OSCE assessment was done at the end of the module.
Data regarding skill acquisition was collected through a structured checklist, while an indigenously designed questionnaire was used to measure student satisfaction.
Student competency was assessed through OSCE in two consecutive modules; each module was assessed during the session (formative assessment) and at the end of the session (summative assessment). One batch of students took the first summative OSCE at the end of Module 11 (Fundamentals of Disease). Ten OSCE stations, including 8 observed and 2 static stations, were administered during the module and again at the end of the foundation module. Each station lasted five minutes. Observed, response and interactive (viva) stations covered general history taking, general physical examination, abdominal and thorax examinations, and eye, ear, nose and throat (ENT) and gynaecological examinations.
The same batch underwent further training in the next module (Module 12: Acute Healthcare) followed by the same pattern of summative assessment used for the previous module. For the second OSCE, conducted after Module 12, OSCE stations were clinical discipline-based, with three stations per discipline of Medicine, Surgery, Obstetrics/Gynaecology, Paediatrics, Ophthalmology and Otorhinolaryngology. There were 18 OSCE stations, including 15 observed and 3 static stations.
Feedback was obtained from students on a 5-point Likert scale, with 6 questions for the first module and 9 questions for the second module.
SPSS 16 was used for statistical analysis. Continuous data was analysed for mean and standard deviation, while categorical data was presented as frequencies and percentages.
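For readers without access to SPSS, the same descriptive summary and two-examination comparison can be reproduced with open-source tools. The sketch below is illustrative only: the score vectors are hypothetical stand-ins for the actual OSCE data, and the paired t statistic shown is one reasonable way to compare two examinations taken by the same students.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic for two exams taken by the same students."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    # t = mean difference divided by its standard error
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical percentage scores (not the study's data) for ten students
osce1 = [84, 90, 78, 88, 92, 81, 86, 79, 95, 83]
osce2 = [82, 91, 80, 86, 93, 79, 88, 77, 96, 84]

print(f"OSCE 1: mean {mean(osce1):.1f}, SD {stdev(osce1):.1f}")
print(f"OSCE 2: mean {mean(osce2):.1f}, SD {stdev(osce2):.1f}")
print(f"paired t = {paired_t(osce1, osce2):.2f} (df = {len(osce1) - 1})")
```

A t value near zero, as with these illustrative scores, would be consistent with the study's finding of no significant difference between the two examinations; the resulting p-value would be read off Student's t distribution with n-1 degrees of freedom.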
Results
Of the 98 students, 94(96%) passed the first OSCE, ranging from 83(84.6%) for Persian language conversation training to 98(100%) for general physical examination (Table-1).
In the second OSCE, 90(92%) students passed, ranging from 72(73%) for Gynaecology to 97(98.7%) each for Surgery and ENT (Figure).
There was no significant difference between mean results of the two OSCEs (p>0.05).
End-of-module feedback showed 89(91.3%) students were satisfied. Of all the respondents, 88(90%) said they actually performed clinical examinations in the CSC, 86(88%) agreed that the checklists provided to them were helpful, 91(93%) were satisfied with the teaching technology and 92(94%) thought CSC teaching was effective (Table-2).
End-of-session feedback from 80(81.6%) students revealed that 77(96.2%) believed CSC courses were well organised, 76(95%) thought course activities were appropriate to the objectives, 75(93.7%) felt confident in applying skills learnt in the CSC on real patients, and 60(75%) said they would recommend establishment of CSCs in other medical colleges.
Discussion
The RMC CSC achieved its primary objective of effectively teaching clinical skills (history-taking and examination) to medical students, decreasing student anxiety by bridging the gap between the classroom and the clinical workplace. The acceptability of the outcome was based on formative assessment.
A study in Pakistan10 rated third-year MBBS student satisfaction as 'positive' with a mean score of 3.32±0.53 out of a maximum of 5.0, translating into a satisfaction rating of 66.4%.
A study conducted in Saudi Arabia11 on the role and utility of a newly established CSC also showed successful induction and completion of courses in a new modular system based on early clinical encounters for students. Not only was there a 400% increase in the use of the centre over three academic years, but student engagement and enthusiasm were also noted and assessed through structured OSCE programmes; the authors advocated greater use of CSCs globally.
Peeraer et al.7,8 also used the OSCE as an evaluation tool when comparing clinical training in a CSL. Our mode of assessment, the OSCE, is considered one of the most reliable and valid measures of clinical performance currently available, corresponding to the 'shows how' level of Miller's Pyramid; it combines the reality of live clinical interactions with the standardisation of problems and the use of manikins.9
Dacre et al.12 showed a 14% improvement in intravenous drug administration skills, assessed by OSCE, after two years of specific skills centre training. Studies have shown that students who graduated from innovative medical schools used more skills during clerkships than students who had followed traditional programmes.13 Ledingham and Harden emphasise that medical schools cannot rely on clerkship experiences alone to provide adequate basic skills training.14 Patients reserve the right not to be involved with students.15 In addition to cultural issues, ethical issues are raised when genital, vaginal, rectal and breast examinations are to be done.16 These factors, together with the spread of computer technology in medicine, have led to an increase in the number of CSLs and in the use of simulation as an innovative teaching approach in medical education.17-19
The role of the CSC training programme in developing clinical skills cannot be overstated; for example, the core curriculum of the Dundee Medical School CSC has been presented in detail by Syme-Grant et al. and provides a good reference framework for any newly established CSC.20 In our study, one of the essential evaluation measures was student feedback, which reflected the students' confidence in attaining clinical skills. Moreover, standardisation of clinical skills has a role in the selection of international medical graduates (IMGs) for jobs in developed countries, as described by Sonderen et al.;21 developed nations may judge the suitability of IMGs by assessing their clinical skills in CSCs prior to approval for clinical practice in their workplaces.
The newly-established CSC achieved the desired objectives and outcomes as results showed a positive impact of clinical skills training on the participants. All medical colleges should embark on baseline CSC training programmes with incremental improvements over time.
References
1. Ahmed AM. Deficiencies of history-taking among medical students. Saudi Med J. 2002;23:991-4.
2. Hao J, Estrada J, Tropez-Sims S. The clinical skills laboratory: a cost-effective venue for teaching clinical skills to third-year medical students. Acad Med. 2002;77:152.
3. Bradley P, Postlethwaite K. Setting up a clinical skills learning facility. Med Educ. 2003;37:6-13.
4. Tolsgaard MG. Clinical skills training in undergraduate medical education using a student-centered approach. Dan Med J. 2013;60:B4690.
5. Al-Yousuf NH. The clinical skills laboratory for medical students and health professionals. Saudi Med J. 2004;25:549-51.
6. Bradley P, Bligh J. One year's experience with a clinical skills resource centre. Med Educ. 1999;33:114-20.
7. Hassan S. Teaching and learning clinical skills at clinical skills centres in medical institutions: are we denying our students this effective skill lab method in developing countries? Pak J Otolaryngol. 2007;23:54-5.
8. Peeraer G, Scherpbier AJ, Remmen R, De Winter BY, Hendrickx K, Van Petegem P, et al. Clinical skills training in a skills lab compared with skills training in internships: comparison of skills development curricula. Educ Health. 2007;20:125.
9. Department of Skill Lab, Liaquat National Hospital [Online] [Cited 2013 October 26]. Available from: URL: www.lnh.edu.pk/Departments/Facilities/skills_lab.php
10. Quadri KHM, Rahim MF, Alam AY, Jaffery T, Zaidi Z, Iqbal M. The structure and function of a new Clinical Skills and Medical Informatics Laboratory (SCIL) in a developing country - a two year institutional experience. JPMA. 2008;58:612-5.
11. Boker A. Setup and utilization of clinical simulation center, Faculty of Medicine, King Abdulaziz University, Saudi Arabia. Life Sci J. 2013;10:1079-85.
12. Dacre JE, Jolly B, Griffiths S, Noble G. Giving intravenous drugs - students should be trained and tested. BMJ. 1993;307:1142.
13. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13:41-54.
14. Ledingham IMcA, Harden RM. Twelve tips for setting up a clinical skills training facility. Med Teach. 1998;20:503-7.
15. Bradley P, Postlethwaite K. Setting up a clinical skills learning facility. Med Educ. 2003;37:6-13.
16. Monnickendam SM, Vinker S, Zalewski S, Cohen O, Kitai E. Patients' attitudes towards the presence of medical students in family practice consultations. Isr Med Assoc J. 2001;3:903-6.
17. Dent JA. Current trends and future implications on the developing role of clinical skills centres. Med Teach. 2001;23:483-9.
18. Sebiabi SM. New trends in medical education: the clinical skills laboratories. Saudi Med J. 2003;24:1043-7.
19. Dacre J, Nicol M, Holroyd D, Ingram D. The development of a clinical skills centre. J Roy Coll Physicians. 1996;30:318-24.
20. Syme-Grant J, Stewart C, Ker J. How we developed a core curriculum in clinical skills. Med Teach. 2005;27:103-6.
21. Sonderen MJ, Denessen E, Cate OTJT, Splinter TAW, Postma CT. The clinical skills assessment for international medical graduates in the Netherlands. Med Teach. 2009;31:e533-8.
This journal is a member of and subscribes to the principles of the Committee on Publication Ethics.