Membership of the Royal College of Psychiatrists (MRCPsych) allows entry into the specialist registrar grade and higher specialist training in psychiatry in the UK. Membership is contingent upon passing parts 1 and 2 of the examination. Both parts consist of a written and a clinical examination, and passing the written component allows progression to the clinical component. Part 1 can be attempted after a minimum of 12 months of postregistration training. Part 2 candidates must have passed the part 1 examination and have completed 30 months of full-time postregistration training. An objective structured clinical examination (OSCE) is now included in the part 1 clinical examination, replacing the individual patient assessment (IPA), in which the candidate interviewed one patient for one hour followed by a 30-minute oral examination. There is no OSCE component to the final part 2 examination, which retains the IPA and patient management problems.
First described by Harden et al. in the 1970s, OSCEs are now an established examination technique used widely to assess clinical skills (1). OSCEs have been shown to have higher reliability and validity than traditional, less structured oral examinations (2), to allow a wider range of skills to be assessed, to be more objective (3), and to be increasingly accepted as a means of episodic performance-based assessment (4). More specifically, the psychiatry OSCE has been shown to have acceptable validity in assessing trainees (5). While the OSCE is used widely to assess the competence of medical students, physicians, nurses, pharmacists, and physiotherapists, educators in psychiatry have been slower to adopt this method of evaluation (6).
A growing body of literature has been critical of the more traditional IPA, particularly in assessing the skills of medical students (4). It has been suggested that the "luck of the draw" element in examiner and patient selection in the IPA plays a significant role in exam score, particularly for borderline pass/fail candidates (7). The OSCE is thought to reduce these sources of variability, improving both the validity and reliability of oral examinations (8).
The Royal College of Psychiatrists introduced the OSCE into the part 1 clinical examination in April 2003. Candidates rotate through 12 short examination stations. Each station consists of a specific task or scenario involving a standardized patient (actor) or a practical skill. There is a fixed period of 7 minutes to complete the task before moving to the next station. The examiner at each station uses a standardized mark sheet to score candidates on their performance in a number of categories. In an example given by the Royal College, a candidate is asked to explain "a diagnosis of schizophrenia and its implications" to a relative. They are scored from A (excellent) to E (severe failure) in a number of areas, including communication skills, features of schizophrenia, causal explanations, treatments and side effects, outcome, issues of risk, and global rating (9). The Royal College has decided to use the OSCE at the entry-level part 1 examination to assess junior residents. Basic components of history taking, examination skills (e.g., cranial nerves, motor system, fundal examination), practical skills (e.g., application of ECT electrodes or ECG leads), emergencies (e.g., resuscitation), and communication skills (e.g., explaining treatment, consent to treatment, prognosis) can all be expected to be examined at this level. Candidates will need to demonstrate to examiners that they have achieved a satisfactory standard. A grade of C or above in at least nine of the 12 stations is required to attain a passing score on the OSCE. A severe failure (grade E) in any one of the 12 stations will be considered evidence that the candidate has not reached the required standard (9).
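The published pass rule (9) can be stated compactly in code. The sketch below is purely illustrative; the function name and data representation are our own, not the College's.

```python
def osce_passes(grades):
    """Apply the published part 1 OSCE pass rule to a list of 12
    letter grades ('A'..'E'), one per station."""
    if len(grades) != 12:
        raise ValueError("expected 12 station grades")
    if "E" in grades:
        return False  # a single severe failure fails the candidate outright
    # Count stations graded C or better; at least nine are required to pass.
    c_or_better = sum(g in ("A", "B", "C") for g in grades)
    return c_or_better >= 9

# Nine stations at C or above with no E passes; any E fails.
osce_passes(["A"] * 9 + ["D"] * 3)   # passes
osce_passes(["A"] * 11 + ["E"])      # fails despite 11 excellent stations
```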
The first ever postgraduate psychiatry OSCE in the United Kingdom (UK) represented a unique opportunity to assess residents' attitudes toward this form of assessment. For many of these residents, 4 weeks away from taking the new examination, this was the first time they would sit an OSCE, and for most, the first time they would sit an OSCE in psychiatry. In this study, our objective was to evaluate their experience and their views on the examination's ability to assess their clinical skills, using an approach similar to that of a study of Canadian residents (2).
A questionnaire used to assess Canadian psychiatry residents' attitudes to the OSCE was modified for use by psychiatry senior house officers (residents) in the UK (2) (a1). Thirty-six residents at similar stages in their training and from different medical centers throughout the UK completed a mock OSCE in the month before the introduction of the Part 1 MRCPsych Clinical OSCE. This examination was conducted using the Royal College of Psychiatrists Guidelines (9). Candidates were tested at 12 stations, each assessing different skills and lasting 7 minutes. Two stations involved a practical task without the need for a patient; all other stations had both an examiner and a patient. Residents were asked to perform one or two tasks at each station—for example, "Take an alcohol history from this patient." Standardized patients (actors) were instructed on their case history, symptom profiles, and representative responses to typical questions. After the examination, all residents completed and returned the questionnaires before receiving any feedback about their individual performances; this was intended to prevent bias. Unlike in the Royal College OSCE, three residents were present at each station, one assuming the role of candidate and two observing. Each candidate was therefore examined at four stations rather than 12.
Of the 36 residents, 12 had previously taken the old Part 1 IPA and been unsuccessful. Nine of these (75%) preferred the OSCE format. Of the 24 who had never previously completed the MRCPsych Part 1 clinical, 12 (50%) preferred the new exam format, 11 (45.8%) did not express an opinion, and 1 (4.2%) thought the IPA was better (t1 and t2-Q10). Twenty-three (63.9%) candidates had previously experienced OSCEs at medical school, for the Professional and Linguistic Assessment Board (PLAB), or for other Royal College examinations (t3 and t4). Results show that the OSCE was mostly positively received (t2). The majority of residents felt the OSCE was a fair assessment of clinical skills appropriate to part 1 standards (t2-Q1). Although 88.9% believed a competent junior resident would pass the OSCE (t2-Q2), fewer (55.5%) believed an incompetent resident would fail (t2-Q3). All 36 (100%) residents believed that the OSCE reflected true situations that residents were likely to encounter (t2-Q5), and 97% thought the actors were realistic (t2-Q6). Additionally, 36.2% thought there was no longer a need for real patients in the MRCPsych exam (t2-Q7). Only three candidates thought that the OSCE should remain confined to the part 1 exam, with 19 (52.8%) believing it should also be incorporated into part 2 (t2-Q11). However, only a third of residents thought the OSCE could appropriately assess more senior, part 2 residents (t2-Q9).
Using a 5-point scale (−2 to +2) to represent questionnaire responses, all questions were scored in the same direction (negative representing a negative response, positive representing a positive response). The items were averaged for each candidate to produce a response score. A one-sample t test was performed on these scores (test value=0). The data provide evidence to suggest that overall there was a positive opinion of the OSCE (Mean=0.77 [SD=0.35, 95% CI=0.65 to 0.88]; t=13.042, df=35, p&lt;0.001).
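The analysis above can be reproduced in a few lines. The sketch below uses invented per-candidate response scores (the study's raw data are not published here) and computes the one-sample t statistic against a test value of 0 along with a 95% confidence interval, using only the Python standard library.

```python
import math

# Hypothetical per-candidate mean response scores on the -2..+2 scale.
# These 12 values are invented for illustration only; the real study
# had 36 candidates and reported Mean=0.77, SD=0.35.
scores = [0.8, 0.5, 1.1, 0.6, 0.9, 0.4, 1.0, 0.7, 0.85, 0.65, 0.75, 0.95]

n = len(scores)
mean = sum(scores) / n
var = sum((x - mean) ** 2 for x in scores) / (n - 1)  # sample variance
se = math.sqrt(var) / math.sqrt(n)                    # standard error of the mean
t_stat = (mean - 0.0) / se                            # one-sample t vs. test value 0

# 95% CI using the two-tailed t critical value for df = n-1 (2.201 for df=11).
t_crit = 2.201
ci = (mean - t_crit * se, mean + t_crit * se)
```

A positive mean whose confidence interval excludes 0 (with p < 0.05) is what supports the conclusion of an overall positive opinion.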
Candidate comments included the perception that the OSCE felt like "less of a lottery" than the IPA and that it allowed "fairer comparison with peers" and a "broader assessment of clinical skills." Candidates also believed that recovering from a poor performance in one station might be possible in subsequent stations, unlike in the IPA. Negative remarks included feeling under pressure to complete the station tasks in time and exhaustion by the end of the OSCE circuit.
The view of these junior psychiatry residents that the OSCE might allow fairer comparison with peers and a broader assessment of skills is shared by psychiatry clerks in a study conducted by Hodges and Lofchy (4). The high level of realism reported compares to a study in which 70% of medical students and 94% of examiners noted that the station scenarios were very realistic (10). In the Canadian study by Hodges et al., on which our questionnaire was based, junior residents also thought that scenarios were realistic, whereas more senior residents were less convinced (2). Differences between the Hodges et al. study and this one include the spread of the residents' clinical experience: the Canadian study included residents across 5 years of training, whereas residents in this study had at most 1 or 2 years. All the residents (100%) in this survey believed that the scenarios reflected realistic clinical situations, compared to 80% of the Canadian residents (2). Eighty-six percent of British junior residents thought the OSCE represented a fair assessment for their level of training, compared to 53% of Canadian trainees. The Canadian residents endorsed the OSCE as a potential component of training for junior colleagues rather than for senior residents. Only one-third of the British trainees agreed that the OSCE would appropriately examine more senior residents, despite having endorsed it for themselves. It might be that a more sophisticated OSCE is needed to meet the assessment needs of more experienced trainees, but this hypothesis would need investigation.
Of interest was the high proportion of British trainees (88.9%) who believed a competent resident would pass the examination and the much lower figure of 55.5% who believed an incompetent resident would fail. This might suggest that candidates found the OSCE less challenging than they had expected, or less challenging than their previous experiences of the old-style IPA.
Pressure to complete tasks in short OSCE stations was apparent and may come at the expense of subtle but important clinical skills such as the development of rapport. Time constraints may also make interpersonal skills assessment challenging for examiners. It has similarly been suggested that advanced competencies, such as therapeutic alliance, transference issues, and synthesizing biopsychosocial factors, can be difficult to assess properly in shorter OSCEs (2). Unlike OSCEs in other specialties such as general surgery, which can rely upon either real patients or actors at stations, the psychiatry OSCE depends entirely upon the performance of actors. Adequate training of the actors is therefore essential, especially for complex presentations such as thought disorder, which are difficult to replicate (5).
While the debate surrounding the OSCE has been somewhat diluted in the UK since its inclusion in part 1, the Royal College maintains that the IPA is an essential component of the membership examination and has no plans to remove it from part 2, despite acknowledging its problems (11). It follows that an OSCE component in the second part of the exam seems unlikely in the near future; candidates, having been spared the unpredictability of real patients and rogue examiners in part 1, still face that risk in part 2. The Part 2 IPA was slightly modified in Spring 2003: a longer oral examination was introduced, with less time spent on delivering the psychiatric history and greater emphasis on clinical reasoning, decision making, management, etiological factors, and psychodynamic formulation (11).
It is important to note that the opinions of the 36 candidates in this study were not formed by taking the actual MRCPsych examination and that, although the examination was run strictly according to Royal College Guidelines, individual candidates completed fewer stations. The subjective experience, including the stakes, may therefore have differed, despite all candidates being within a month of sitting the real examination. We plan to conduct an assessment of residents who have recently completed the actual OSCE in the near future.
The introduction of the OSCE appears to be welcomed by this small sample of candidates as a realistic examination method for the part 1 examination. However, more research and further replications are warranted before assuming that these findings reflect the wider population of Royal College candidates. The absence of more senior residents, in contrast to the Hodges et al. study, may have contributed to a more favorable response (2). Moreover, the debate regarding examination format continues in the U.S., Canada, and Australia, and it is not certain that this British experience would generalize to those different contexts.
In the first Royal College Part 1 OSCE, held in Spring 2003, 75.3% of residents passed, compared to 72.5% who passed the IPA in the previous sitting. The overall pass rate (written examination and OSCE) for the Spring 2003 part 1 examination was 41.99%, a 0.36% increase (12). These figures are unlikely to encourage candidates preparing to take part 1 of the MRCPsych, but candidates might find some reprieve if they believe the OSCE represents a fairer examination.
Drs. Sauer, Santhouse, and Blackwood organize MRCPsych educational courses.