Assessing the performance of junior doctors in the workplace is important but challenging. The optimum assessment is by direct observation of doctors’ interactions with patients and comprises multiple assessments by multiple examiners on a variety of patient problems. Clinical supervisors are best suited to observe and certify trainees, but often do not observe them directly.1 Performance assessment is not done well in most instances, as it requires multiple sampling over time.2 In-training assessments done at the end of a term introduce a “halo effect”.3
Most of these problems can be overcome by the mini clinical evaluation exercise (mini-CEX), developed by the American Board of Internal Medicine.4 The mini-CEX involves direct observation of a trainee in a focused clinical encounter, followed by immediate feedback. The assessment is recorded on a rating form that has been shown to have high internal consistency and reliability among internal medicine trainees, giving scores comparable with those of a high-stakes clinical examination.5,6 The mini-CEX has higher fidelity than other formats.7
International medical graduates (IMGs) comprise about 25% of the medical workforce in developed countries.8 Their certification for registration is a major task of the medical boards and registration authorities in Australia and other countries.9 The Australian Medical Council (AMC) has conducted clinical examinations to assess IMGs since 1978.10 Successful candidates undertake 12 months of supervised practice before obtaining full registration. Despite having passed the current AMC clinical examination, IMGs’ competence and performance in the workplace have been criticised.11
Because of differences in the number of encounters per participant, we included a maximum of eight encounters in the generalisability study. The results of the variance components estimation are shown in Box 1. The G coefficient for eight encounters was 0.88. As a measure of precision, the standard error of measurement for the measurement design with eight encounters was estimated at 0.35 (that is, 19 times out of 20, the “true” score of an IMG will fall within ± 0.69 of the observed score). The results of the D study indicated that 10 encounters were necessary to achieve a reliability of 0.90 (Box 2).
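The arithmetic behind these figures can be illustrated with the Spearman–Brown prophecy formula, a standard shortcut for projecting reliability to a different number of encounters. (The study’s D study was based on the estimated variance components in Box 1; the sketch below is an approximation from the composite G coefficient alone, and the function names are illustrative, not from the study.)

```python
def single_encounter_reliability(g: float, n: int) -> float:
    """Invert Spearman-Brown: back out the reliability of one
    encounter from a G coefficient observed over n encounters."""
    return g / (n - (n - 1) * g)

def projected_reliability(r1: float, n: int) -> float:
    """Spearman-Brown prophecy: reliability of a score averaged
    over n encounters, given single-encounter reliability r1."""
    return n * r1 / (1 + (n - 1) * r1)

# G = 0.88 over 8 encounters implies roughly 0.48 per encounter...
r1 = single_encounter_reliability(0.88, 8)

# ...which projects to about 0.90 for 10 encounters, consistent
# with the D-study result reported above.
print(round(projected_reliability(r1, 10), 2))  # 0.9

# The 95% band implied by the standard error of measurement:
sem = 0.35
print(round(1.96 * sem, 2))  # 0.69, matching the reported +/- 0.69
```

Running the sketch reproduces both reported values: a projected reliability of 0.90 at 10 encounters, and a 95% confidence band of ± 0.69 score units around an observed score.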
Under the conditions and settings used, the mini-CEX reliably assessed the clinical performance of IMGs with eight to 10 encounters. This is consistent with the results of other studies.7 As the mini-CEX is conducted in the workplace with real patients, it has high fidelity, and it is acceptable to both IMGs and examiners. A fail rate of 9% (19/209 encounters) across 12 IMGs is concerning, given that these IMGs had passed the AMC clinical examination.
- Balakrishnan R Nair1,2
- Heather G Alexander3
- Barry P McGrath4
- Mulavana S Parvathy2
- Eve C Kilsby3
- Johannes Wenzel5
- Ian B Frank6
- George S Pachev7
- Gordon G Page7
- 1 University of Newcastle, Newcastle, NSW.
- 2 John Hunter Hospital, Newcastle, NSW.
- 3 Griffith Institute for Higher Education, Griffith University, Brisbane, QLD.
- 4 Monash University, Melbourne, VIC.
- 5 Southern Health, Melbourne, VIC.
- 6 Australian Medical Council, Canberra, ACT.
- 7 University of British Columbia, Vancouver, British Columbia, Canada.
The study was supported by a grant from the AMC. We thank all the supervisors, international medical graduates and patients.
Competing interests: None identified.
- 1. Holmboe ES. Faculty and the observation of trainees’ clinical skills: problems and opportunities. Acad Med 2004; 79: 16-22.
- 2. Williams RG, Dunnington GL, Klamen DL. Forecasting residents’ performance — partly cloudy. Acad Med 2005; 80: 415-422.
- 3. Wilkinson TJ, Wade WB. Problems with using a supervisor’s report as a form of summative assessment. Postgrad Med J 2007; 83: 504-506.
- 4. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med 1995; 123: 795-799.
- 5. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training. Acad Med 2002; 77: 900-904.
- 6. Hatala R, Ainslie M, Kassen BO, et al. Assessing the mini-Clinical Evaluation Exercise in comparison to a national specialty examination. Med Educ 2006; 40: 950-956.
- 7. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med 2003; 138: 476-481.
- 8. Mullan F. The metrics of the physician brain drain. N Engl J Med 2005; 353: 1810-1818.
- 9. McGrath BP. Integration of overseas medical graduates into the Australian medical workforce. Med J Aust 2004; 181: 640-642.
- 10. Breen K, Frank I, Walters T. Australian Medical Council: the view from inside. Intern Med J 2001; 31: 243-248.
- 11. Van Der Weyden MB, Chew M. Arriving in Australia: overseas trained doctors [editorial]. Med J Aust 2004; 181: 633-634.
Abstract
Objective: To evaluate the feasibility, reliability and acceptability of the mini clinical evaluation exercise (mini-CEX) for performance assessment among international medical graduates (IMGs).
Design, setting and participants: Observational study of 209 patient encounters involving 28 IMGs and 35 examiners at three metropolitan teaching hospitals in New South Wales, Victoria and Queensland, September–December 2006.
Main outcome measures: The reliability of the mini-CEX was estimated using generalisability (G) analysis, and its acceptability was evaluated by a written survey of the examiners and IMGs.
Results: The G coefficient for eight encounters was 0.88, and the D study indicated that a reliability of 0.90 would be achieved with 10 encounters. Almost half of the IMGs (7/16) and most examiners (14/18) were satisfied with the mini-CEX as a learning tool. Most of the IMGs and examiners valued the immediate feedback, which is a strong component of the tool.
Conclusion: The mini-CEX is a reliable tool for performance assessment of IMGs, and is acceptable to and well received by both learners and supervisors.