To the Editor: Objective structured clinical examinations (OSCEs) are often used in the undergraduate assessment of psychiatric skills (1). OSCEs resemble future practice more closely than written examinations because medical students must demonstrate their skills, and they are also considered more reliable than long case examinations. However, the role of OSCEs in postgraduate assessment is less well established. Especially controversial is the use of OSCEs to assess complex competencies such as “advanced communication skills.”
Recently, Whelan et al. (2) reported a study on the assessment of trainee psychiatrists using OSCEs. The authors compared the scoring of communication skills by actors playing standardized patients and by experienced clinicians and found only a moderate correlation between the scores. Therefore, Whelan et al. (2) were not convinced that scoring by standardized patients should become part of OSCEs in postgraduate psychiatry. However, a different interpretation of their empirical findings is also possible: whether scoring by standardized patients is acceptable depends on the role the OSCEs are supposed to fulfill in the overall assessment.
Parkes et al. (3), in a comment on the study by Whelan et al. (2), stated that many skills can be evaluated reliably with a yes/no checklist, provided there has been sufficient expert input into the checklist. Bergus et al. (4) found that by using a yes/no scale, trained observers could reliably judge communication skills of medical students during a pediatric clerkship. In the study by Whelan et al. (2), no checklist was used. A checklist would probably have improved the reliability of scoring communication skills by standardized patients and clinicians. However, the question is whether this would solve all the problems.
Parkes et al. (3) also noted that having less clinical knowledge can make it easier to train as an examiner. Clinical experts have developed unconscious intuitions and habits that are useful in a clinical setting but not in an examination. Examinees need specific scoring and feedback, not a low score simply because the examiner felt something was wrong. Some knowledge, however, cannot be formalized in strict rules—so-called “tacit knowledge” (5)—even though it is part of the unconscious intuitions and habits of experts. Tacit knowledge cannot be tested using yes/no checklists and will not be assessed at all if checklists during an OSCE are the only assessment tool.
When Whelan et al. (2) asked standardized patients and clinicians to complete a questionnaire about which aspects they considered important in judging communication skills, the two groups did not differ: they found the same aspects important, yet they scored psychiatry trainees differently. This is consistent with there being aspects of communication skills (tacit knowledge) that cannot be examined via yes/no checklists.
Testing communication skills more reliably via yes/no checklists probably means disregarding important aspects of clinical competence. The question remains whether this is acceptable, especially at the postgraduate level. Some trade-off must be made between reliability and the testing of tacit knowledge.
There are two options. One could develop OSCEs that offer reliable assessments, mainly via yes/no checklists, and accept that they do not test all the knowledge, skills, and attitudes necessary to work independently as a psychiatrist; other forms of assessment (portfolios, case reports, supervisors’ views, etc.) would then be needed as well. With this option, standardized patients could judge communication skills via a yes/no checklist, with the understanding that this tests only limited aspects of communication skills.
The alternative would be to develop OSCEs that test broader skills than can be assessed via yes/no checklists and to accept that they are somewhat less reliable. In that case it would be preferable to ask experienced clinicians to judge communication skills globally, without specific checklists.
Therefore, the role OSCEs are to fulfill in postgraduate assessment needs to be clarified first (whether or not they are the only form of assessment). Once that decision has been made, we can judge which is more important, reliability or the testing of tacit knowledge, and develop OSCEs and rating scales accordingly.
At the time of submission, the author reported no competing interests.