The efficient and accurate assessment of clinical performance is a substantial challenge in continuing medical education (CME) and other educational settings. Currently, educational outcomes measurement in many CME programs is imperfect (1), as outcomes are often assessed through brief postparticipation questions that measure either mere participation or knowledge of isolated facts. A recent review of CME assessment activities at a large academic medical center found that over 80% of educational assessment tools measured only Kirkpatrick level 1 outcomes (satisfaction/reaction) (2), and that no assessment items measured outcomes related to clinical practice or patient care (3).
The dilemma when attempting to assess higher-level educational outcomes is that, on one hand, closed-ended multiple-choice questions and other commonly used assessment tools are unlikely to reflect clinical practice, while on the other hand, more comprehensive tools such as simulated patients or chart reviews are time-consuming, expensive, and simply not practical in most settings. However, clinical case vignettes that allow open-ended free-text responses are a promising assessment tool that may assess clinical performance in a more efficient manner. These vignettes present a clinical case in several parts, and responses are scored with a standardized system generated a priori using extant literature and other sources of expert guidance.
Such clinical vignettes, first utilized by Peabody et al. (4–6) to assess the performance of primary care physicians and internal medicine residents, appear to be essentially equivalent to standardized patients and chart audits in assessing actual clinical behavior in the management of common outpatient conditions and the provision of appropriate preventive care (4). However, case vignettes with open-ended questions and algorithmic scoring of free-form responses have only rarely been used in CME activities (7), and to our knowledge have not yet been used to assess clinicians’ ability to evaluate and treat patients with psychiatric illness (i.e., to perform clinically relevant needs assessments). In this article, we describe the development and implementation of clinical case vignettes to assess guideline-adherent assessment and treatment of patients with bipolar disorder and first-episode schizophrenia among clinicians attending psychiatric CME activities.
In 2008, the Massachusetts General Hospital Psychiatry Academy (MGH-PA) held live symposia on mood disorders and other topics in psychiatry (spring 2008, in four cities) and schizophrenia (October 2008, in one city). These symposia included live didactic lectures and interactive case conferences. In spring 2008, a clinical case vignette was used to assess knowledge regarding the management of a patient presenting with bipolar mania and bipolar depression; in October 2008, a vignette assessed workup and management of a patient presenting with a first episode of psychosis. Prior approval to collect and analyze learners’ responses to the vignettes was obtained from the Partners Health Care Human Research Committee.
Clinical case vignettes were created based on the model of Peabody and colleagues (4–6) (see Table 1 for a sample vignette; both vignettes are available from the authors). The vignettes were separated into several sections, beginning with a broadly described scenario with multiple diagnostic possibilities that gradually became focused to allow for questions about management. For the spring 2008 symposia, the vignette outlined a patient with dysphoria and labile mood, whose diagnosis was unclear even after further evaluation. After more diagnostic information was obtained, it became clear that the patient had a diagnosis of bipolar disorder. The vignette went on to describe the patient first having a manic episode and then bipolar depression. For the symposium on schizophrenia, the vignette described a patient initially presenting with probable psychotic symptoms who, by the end of the case, was diagnosed with schizophrenia. In both cases, a faculty expert on the topic created the clinical case vignette, and the MGH-PA educational outcomes team (JH, RB, TP, LB) then edited the case for clarity, focus, and clinical relevance.
Attendees were asked to write free-text responses to open-ended questions at strategic points during the case. In both cases, the first two questions inquired about differential diagnosis and the next steps in the evaluation of the patient. Attendees were then provided with a diagnosis for the patient, and the final two questions asked attendees to make management decisions, such as type and duration of treatment.
To generate the scoring system for the free-text responses (see Table 2 for examples; full scoring algorithms available from the authors), the educational outcomes team reviewed published practice guidelines on the assessment and treatment of bipolar disorder and schizophrenia, along with a published review of first-episode schizophrenia (8). Once created, the scoring system was further refined and modified through discussion between the outcomes team and the faculty expert who designed the case.
Vignette Completion and Scoring
A 10-minute period was reserved for this educational assessment activity at the outset of the CME event, and the vignettes were completed and collected at that time. A research assistant (SR), who had received education about the pertinent psychiatric conditions and the vignette, collected the forms and transcribed responses from the vignettes into a spreadsheet. Educational outcomes team members (JH, RB, TP) independently scored the responses using the previously created scoring algorithms and met to discuss discrepancies in scoring until consensus was reached.
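In this study the scoring was performed manually by clinician raters against the a priori scoring algorithms. Purely as an illustration of the workflow (independent rubric-based scoring of free-text responses, followed by flagging of discrepancies for consensus discussion), the logic could be sketched as follows; all keywords, point values, and example responses below are hypothetical and are not drawn from the study's actual scoring system:

```python
# Illustrative sketch only; the study's scoring was done by hand, not by software.
# Rubric keywords, point values, and responses are hypothetical examples.

def rubric_score(response: str, rubric: dict) -> int:
    """Award points for each rubric element mentioned in a free-text response."""
    text = response.lower()
    return sum(points for keyword, points in rubric.items() if keyword in text)

# Hypothetical rubric for a differential-diagnosis question.
rubric = {"bipolar": 2, "major depress": 1, "substance": 1}

responses = [
    "Consider bipolar disorder vs. substance-induced mood disorder",
    "Major depression",
]

# Each rater scores independently; here three identical automated passes
# stand in for three human raters, for illustration only.
rater_scores = [[rubric_score(r, rubric) for r in responses] for _ in range(3)]

# Flag any response where raters disagree, for consensus discussion.
flagged = [i for i in range(len(responses))
           if len({scores[i] for scores in rater_scores}) > 1]

print(rater_scores[0], flagged)
```

In practice, of course, free-text clinical answers are far too nuanced for simple keyword matching, which is why the study relied on trained raters and consensus meetings for ambiguous responses.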
For the mood disorders symposia, 374 participants attended the event, with 192 fully completing the vignette. Data on professional discipline were available for 163 of the 192 respondents: the majority of respondents were physicians (n=110, 67.5%), most of whom (n=89) were psychiatrists. The disciplines of other respondents included nurse (n=30, 18.4%), psychologist (n=6, 3.7%), physician assistant (n=5, 3.1%), and other nonphysician (n=12, 7.4%).
For the October 2008 symposium on schizophrenia, 62 of 81 clinicians fully completed the clinical case vignette questions. Complete data on professional discipline were available for 53 respondents; the majority of respondents (n=34, 64.2%) were physicians, almost all of whom were psychiatrists (n=32). The disciplines of other completers included nurse (n=16, 30.2%), psychologist (n=1, 1.9%), and other nonphysician (n=2, 3.8%).
Scoring of the vignettes proceeded as described in the Methods section. The research assistant successfully transcribed responses for all subjects into a spreadsheet, and the educational outcomes team reached scoring consensus for all responses. Once scores were tabulated, a summary of group performance on the vignette was created and emailed to registered participants shortly after the live event.
We found that it is feasible to administer and score clinical case vignettes on the assessment and management of bipolar disorder and first-episode schizophrenia as part of a CME program. Vignettes were introduced at the outset of the program, and most respondents appeared to finish the brief activity in 3–4 minutes. Though participation was more limited in our first attempt in spring 2008, three-quarters of attendees at the schizophrenia program completed this optional assessment tool, and the expert-approved scoring algorithm for free-form answers was utilized efficiently.
Other groups have used case vignettes as part of board licensing examinations or continuing education, but have not utilized the vignettes in combination with free-form responses scored in a standardized manner using evidence-based practice guidelines, as per the methods of Peabody and colleagues (4–6). This method of educational assessment appears to evaluate clinical performance in a more comprehensive and real-world manner than traditional educational assessment tools, but to our knowledge has not been used as a needs assessment tool in psychiatric CME.
Several challenges remain in the use of these vignettes in CME. First, creation of the vignettes and scoring system requires knowledge of the clinical material and practice guidelines. Second, there must be a designated person responsible for scoring the vignettes who is able to make decisions about ambiguous answers. Finally, there must be a method to provide feedback to respondents about their performance. One could argue that this is a substantial effort for educational assessment related to CME. However, there is evidence to suggest that current CME programs may not even assess higher levels of learning (e.g., related to clinical performance) to determine whether or not CME is having an impact on patient care (1). Though this method of vignette-based assessment does require effort, it is far more practical than other established techniques used to assess patient care performance (e.g., chart reviews), and with further refinement it could be made even more efficient. Based on our experience with both vignettes, accounting for review and scoring of the answers by a clinician (approximately 1 minute per respondent), meetings to reach consensus on complex or vague answers (approximately 20 minutes), and finalization of scoring (approximately 20 minutes), we estimate that this process for a group of 60 participants would take less than 2 minutes per person.
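As a rough check, the per-person estimate above follows directly from the approximate times reported: a sketch of the arithmetic, using those figures, is below.

```python
# Back-of-the-envelope estimate of scoring effort per participant,
# using the approximate times reported in the text.
participants = 60
review_min = participants * 1   # clinician review/scoring: ~1 minute per respondent
consensus_min = 20              # consensus meeting on complex/vague answers
finalization_min = 20           # finalizing and tabulating scores

total_min = review_min + consensus_min + finalization_min
per_person = total_min / participants
print(f"{total_min} minutes total, {per_person:.2f} minutes per participant")
# → 100 minutes total, 1.67 minutes per participant
```

At roughly 100 minutes of total staff time for 60 participants, the effort stays under the 2-minutes-per-person bound cited above, and the fixed consensus and finalization costs shrink on a per-person basis as group size grows.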
These methods could also be extrapolated to other settings. For example, clinical case vignettes could be used in didactic sessions with residents and medical students in smaller-group settings to provide a structured, clinically based approach to needs assessment or, if administered both presession and postsession, to the measurement of learning. Indeed, in these settings the process may be even more efficient, given the smaller number of participants and the lecturer's ability to score the vignettes and provide rapid feedback, and such vignettes may be particularly well-suited to this venue.
Our preliminary findings about the use of vignettes have several limitations. First, we had only a moderate number of participants, and participation was initially low. Second, though these vignettes with free-text responses were essentially equivalent to chart reviews and standardized patients in other settings (4), we did not utilize a comparison methodology (e.g., a multiple-choice test) in this feasibility study, and it is possible that a less complex outcomes tool may have gathered the same information. Finally, in future evaluations of this approach, it would be optimal to use more than one vignette per topic and to have a greater understanding of the demographic and practice patterns of participants to better understand the effectiveness of this approach.
In conclusion, our exploratory investigation suggests that it is feasible to utilize clinical case vignettes (with systematic scoring of free-text responses) to assess knowledge and performance related to psychiatric CME. Such vignettes can be integrated into educational events, completed rapidly by participants, and scored efficiently by staff. This strategy has the potential to serve as a practical method of assessing higher-level outcomes in CME and other educational settings. Such methodology could be important, given the evidence that traditional didactic-lecture-driven CME may not lead to meaningful or rational changes in clinical practice (9, 10) and given the limited assessment methods used in many CME programs. Additional studies of this approach, utilizing multiple vignettes, a comparison methodology, and a larger and better-characterized sample, are required to further define the utility of this methodology.
Drs. Huffman, Petersen, Fromson, and Birnbaum receive honoraria payments from Reed Medical Education (a publicly traded company working as a logistics collaborator for the MGH Psychiatry Academy). The education programs conducted by the MGH Psychiatry Academy are supported through Independent Medical Education (IME) grants from pharmaceutical companies co-supporting various components of the overall program along with participant tuition. In the past 12 months, these educational programs have been sponsored by: Astra-Zeneca, Bristol-Myers Squibb Company, Eli Lilly and Company, Forest, Janssen Medical Affairs LLC, Pfizer, Sanofi, Shire, and McNeil. At the time of submission, Ms. Romeo, Dr. Baer, and Ms. Sutton-Skinner reported no competing interests.