The Accreditation Council for Graduate Medical Education (ACGME) recently emphasized improving resident evaluation and, to that end, has tightened the accreditation requirements in this area. The ultimate goals of these changes are to improve resident education and residency programs overall. Between 1998 and 1999, the ACGME Outcome Project Advisory Group developed a plan to help residency programs address these new requirements. The Outcome Project Advisory Group's objectives were to develop a set of general competencies, help programs to identify and develop dependable methods for assessing performance within the competencies, and provide examples of dependable evaluation tools (1). The Advisory Group identified and defined six core competencies that are intended for application to all specialties: patient care, professionalism, interpersonal and communication skills, medical knowledge, practice-based learning and improvement, and systems-based practice. It is implicit that the competencies represent basic yet essential qualities or skills that all residents are expected to display. Further, it is assumed that the competencies should naturally be fostered in routine training within all medical specialties.
As of July 2002, programs must develop and/or implement evaluation methods, beyond traditional global faculty evaluation, that address resident performance within these competencies. Each specialty-specific Residency Review Committee (RRC) is expected to define the competencies and adopt an evaluation approach suited to its specialty. For many programs, this means implementing new assessment tools. Developing such tools from scratch is a monumental task and would be impractical and unrealistic for most residency programs. Instead, programs may reassess the methods they currently use to determine whether those methods can meet the ACGME expectations. Alternatively, programs need access to reliable, practical, and comprehensive tools they can adopt to fulfill their own specific needs as well as the expanding requirements of the ACGME. The portfolio is one such tool programs might consider.
An underlying assumption of any evaluation method is that it reflects the learner's true ability or skill. Unfortunately, not all evaluation methods depend on actual learner work: standardized patient examinations, simulations and models, and objective structured clinical examinations are all, to some degree, artificial. The "ACGME Toolbox" (2) identifies the portfolio as an established method of resident evaluation. Portfolios have been defined as purposeful collections of work that exhibit a learner's achievement in a given area (5), and one of their valuable attributes is that they reflect actual learner work, although reports of their use in medical education evaluation remain limited (3, 4).
Prior to the ACGME Outcome Project and the Toolbox, the University of Arkansas for Medical Sciences (UAMS) psychiatry residency program integrated portfolios as one approach to evaluating residents (4). A committee of faculty and residents within the department used the RRC requirements for psychiatry to identify 13 psychiatric skills (Appendix 1). All residents completing the UAMS psychiatry residency must demonstrate competence in each of these areas by the end of training. Residents select their best work to showcase in the portfolio, which consists of individual entries demonstrating the resident's abilities within each of the 13 skill areas. An entry is a collection of documents that reflects actual resident work within a skill area and may comprise chart documentation, laboratory or radiology records, literature searches, and other relevant data. Each entry also includes a cover letter that is, most importantly, an exercise in self-evaluation: it gives residents the opportunity to explain how the entry demonstrates their best work and to clarify, acknowledge, or describe any potential shortcomings of the entry. Residents receive guidelines, specific to each psychiatric skill, that suggest what to consider and include for each entry. The guidelines provide the criteria for evaluation and a scoring rubric ranging from a low score of 1 to a high score of 6. First-year residents must complete two portfolio entries, and upper-level residents complete five. Further details on the development of portfolios for this residency program are described elsewhere (4).
By the time the ACGME institutionalized the general competencies, portfolios were a well-integrated evaluation tool within the UAMS Department of Psychiatry. The 13 psychiatric skills identified as portfolio entry topics were selected because they broadly represent skills the department believes are necessary to safely and competently practice general psychiatry. Analogously, the six ACGME general competencies broadly apply to all residency programs and all medical specialties. We hypothesized that the criteria for competence within each psychiatric skill were broad enough that portfolio entries would naturally reflect resident performance within the six general competencies. If these assumptions were true, it would be desirable and practical to use an evaluation tool that not only serves the needs of the psychiatry department but also fulfills the expanding ACGME requirements. Would it be possible, therefore, to use the portfolio as a valid method to address these general competencies? The purpose of this study was to determine whether portfolio entries, designed to evaluate essential skills in psychiatry residency training, could also reflect resident performance within the six general competencies defined by the ACGME. If we could demonstrate this outcome, then others could adopt the portfolios used in this residency program for their purposes.
During the 2000–2001 academic year, 18 of 22 (82%) PGY-1 through PGY-4 level psychiatry residents at UAMS participated, completing either two (PGY-1) or five (PGY-2 through PGY-4) portfolio entries. Residents submitted a total of 80 entries.
To aid the reader, a description of an entry is presented here. The objectives for a "legal issues" entry are to demonstrate an adequate understanding of the legal system and of physicians' obligations within it, and to demonstrate skillful management of the legal aspects of clinical cases. A resident would select a challenging case from the current year that best demonstrates fulfillment of these objectives. The case may involve, for example, matters of involuntary admission, 72-hour holds, cases of suspected abuse or rape, decisions regarding instituting searches for patients who have eloped, interventions with intoxicated patients planning to drive or with patients making specific threats, and consultations regarding capacity. To complete the entry, the resident would provide photocopies or other reproductions of actual documents from the selected case, including the psychiatric evaluation, progress notes, legal documents such as 72-hour holds or court documents, letters of correspondence, or any other relevant material, as well as a cover letter that summarizes the situation, the specific legal obligations, and any interventions taken by the resident. This completed set of documents, or "entry," would then be submitted for review by the raters.
Two board-certified psychiatrists rated the entries. During training, they practiced scoring entries and established interrater reliability. Rater bias was minimized by obscuring all patient and resident identifiers before submission and by using raters who had no clinical association with UAMS, were blind to both the cases and the residents, and were themselves unknown to the residents. Raters were paid $20 per entry.
In addition to evaluating the entries according to the department's established criteria and scoring rubric, the raters completed a report form concerning the general competency content of each entry (Figure 1). The form contained the ACGME definition of each competency. By checking none, some, or definite, the raters indicated whether each portfolio entry reflected resident performance within each of the six competencies. Raters did not judge resident performance within the competencies per se, only whether an entry reflected information about each general competency. We assigned scores of 0 for none, 1 for some, and 2 for definite. The raters agreed on 71% of these judgments, with a kappa coefficient of 0.31, which is considered fair (6). We averaged the scores of the two raters across the 80 entries. Statistical analyses were descriptive: we calculated the median rating for each competency by type of entry and tabulated overall percentages to help determine whether portfolios can be used to measure the general competencies.
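For readers who wish to reproduce the agreement statistics described above, percent agreement and Cohen's kappa for two raters can be computed as follows. This is a minimal sketch; the ratings shown are hypothetical and are not the study data.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of entries on which the two raters chose the same category."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return matches / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement assumes each rater assigns categories independently
    # according to that rater's own marginal frequencies.
    categories = set(r1) | set(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings (0 = none, 1 = some, 2 = definite) for ten entries.
rater1 = [0, 1, 2, 2, 1, 0, 1, 2, 1, 0]
rater2 = [0, 1, 2, 1, 1, 0, 2, 2, 1, 1]

print(percent_agreement(rater1, rater2))  # 0.7
print(cohens_kappa(rater1, rater2))
```

Kappa is conventionally interpreted on the scale cited in reference (6), where values between 0.21 and 0.40 indicate fair agreement.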
Raters scored 80 entries representing all 13 psychiatric skills. The biopsychosocial formulation was the most common entry type, with 21 entries submitted. There were only two self-directed learning entries and two treatment modalities entries. Table 2 provides the median response category (none, some, definite) for each psychiatric skill by competency. Ten of the 13 skills reflected at least five of the competencies; among those 10, the one competency not reflected was practice-based learning. For example, the psychiatric skill of crisis management provided inference about patient care, medical knowledge, communication and interpersonal skills, professionalism, and systems-based practice. The biopsychosocial formulation, self-directed learning, and teaching and presentation skills entries reflected fewer than five of the competencies.
Table 2 also summarizes which psychiatric skills reflected each general competency: 1) 100% of the psychiatric skills reflected at least some evidence for medical knowledge; 2) 92% of the psychiatric skills provided at least some evidence for patient care, communication and interpersonal skills, and professionalism; 3) 77% of the skills provided at least some evidence for systems-based practice; and 4) 31% provided at least some evidence for practice-based learning.
Our data suggest that portfolio entries may reflect resident performance within the ACGME general competencies. Collectively, the entries represented all six competency areas, and the medians indicate which competencies were best illustrated by which skills. Although entries varied in which competencies they reflected, six of the 13 skills reflected all of the general competencies except practice-based learning: crisis management, legal issues, medical psychiatry, neuropsychiatry, treatment course, and working with teams and families.
As for whether portfolios could reflect resident performance within all of the general competencies, consider a set of five entries consisting of the biopsychosocial formulation, initial evaluation and diagnosis, medical psychiatry, professional communication, and treatment course entries. The data in Table 2 reveal that this set of five entries adequately reflects resident performance within all general competencies except practice-based learning.
The practice-based learning competency was the least well reflected of the six competencies in the portfolio entries. Because the portfolio project at UAMS was implemented before the changes in the ACGME requirements, the portfolio guidelines and the definitions of the 13 skills were not originally designed to relate to the general competencies. Revising the portfolio guidelines with the competencies in mind could transform other entries into sources of inference about practice-based learning. Additionally, practice-based learning could be reflected in the overall process of selecting entries and constructing the portfolio; a cover letter discussing how and why the individual entries that constitute the portfolio were chosen could better reflect performance within this competency.
Another chief consideration in this study concerns the competency definitions. We used the ACGME definitions of the general competencies rather than those adapted by the psychiatry RRC. Tailoring the definitions of the general competencies to better apply to psychiatry would likely improve the level of inference embedded in the portfolio entries; each specialty-specific RRC is expected to define the competencies to fit the needs of the specialty (1). Were this to happen, perhaps our professional communication entry would provide more inference about the professionalism competency than it does currently, or a greater number of entries would reflect more about practice-based learning. Thus, modifying both the general competency definitions and the portfolio entry guidelines could make the portfolio a more robust tool for evaluating resident performance within the general competencies.
In addition to the issue of competency definition, there are other limitations to this study. First, ratings regarding the competency content of the portfolio entries were based on actual entries and may have been constrained by the quality of the entries. Second, although there were 80 entries rated for the study, six of 13 skill areas were represented by three or fewer entries. Larger sample sizes would likely provide more robust data. Third, we used only two raters. It is possible that using three or more would strengthen our data. Finally, these data are based on findings at one institution within one specialty. It would be beneficial to conduct similar studies at other institutions and within other specialties.
In summary, portfolios are just one method of evaluation. While it would be practically desirable for a single evaluation method to address all the competencies, this may often not be possible, and it is reasonable and realistic for residency programs to use more than one form of evaluation to examine resident performance. In our case, other methods could be used to address the specific competencies that are less thoroughly addressed by portfolios. Indeed, the Outcome Project Advisory Group expects residency programs to use multiple assessment methods in monitoring resident performance (7). Nevertheless, as accreditation requirements continue to expand and evolve, it becomes increasingly important that residency programs, regardless of specialty, have access to sophisticated tools that serve multiple purposes. This study suggests that portfolios are an effective method not only for evaluating resident performance within specific skill areas but also for reflecting resident performance within the ACGME general competencies.