Every residency program currently requires that program faculty and the program director evaluate residents to assess their abilities. Typically, faculty members complete a rating scale for a set of items and add comments. This method of evaluation treats faculty members as expert observers, but it has at least three shortcomings. First, faculty members rarely receive training in effective evaluation techniques. Second, faculty may have few opportunities to make observations. Third, they may not share the evaluation directly with the resident. To address these deficiencies, proposed evaluation methods have included record review, checklists, global ratings, standardized patients, and objective structured clinical examinations. Whatever approaches a program chooses, the evaluations must address the Accreditation Council for Graduate Medical Education (ACGME) competency requirements. In this paper, we describe how to develop a novel format for evaluating resident competency: the portfolio.
A portfolio is a collection of documents taken from a resident's actual work, chosen by the resident to demonstrate competency in defined areas. Portfolios include self-reflection, which traditional evaluation methods often lack. Some health professions programs report using portfolios in the educational evaluation process. For instance, nursing programs use portfolios to indicate progress in courses (1,2), culmination of degree programs (3,4), and professional development (5). In medicine, programs use portfolios primarily for undergraduates and continuing medical education. For example, medical undergraduates in Scotland complete a portfolio of learning experiences as part of their clinical attachments in community-based hospitals (6). In Canada and the United Kingdom, professionals use learning portfolios as part of continuing medical education (7–10), and clinical educators maintain portfolios to demonstrate teaching skills (11).
Research concerning portfolios in medical education includes a study demonstrating that lower-ability students who kept learning portfolios performed better on an objective structured clinical examination than those who did not keep a portfolio (12). Mathers et al. (13) found that practitioners integrated principles used in a learning portfolio into their practices.
Few have reported on portfolios in residency education. Fung et al. (14) described an Internet learning portfolio. In a learning portfolio, residents identify their learning gaps and describe how they filled them. Lim et al. (15) analyzed the content of learning portfolios in a family medicine residency program in Singapore. Both of these studies described what was contained in a portfolio rather than addressing how the information might be used in evaluation. While primary and secondary educators commonly incorporate portfolios into education, we found relatively few studies in medical education, and none for the purpose of demonstrating competence.
Most evaluation methods are deficient in assessing self-directed learning, self-reflection, and broader areas such as professionalism and the ability to respond to system needs in providing the most effective care. Gordon (16) suggests using a process that emphasizes self-assessment for resident evaluation. The strength of portfolios for self-assessment is a frequently cited hallmark of the technique (17–20), and portfolios therefore may capture these often-missing elements.
The research on portfolios is still evolving. There is evidence that portfolios can be reliably scored, but validity has been more difficult to demonstrate (18,21). Additionally, we recognize that portfolios would not be used alone but as part of an evaluation system that would likely include faculty evaluation, standardized examinations, and other system-specific evaluations such as audits or patient satisfaction data.
When a program chooses to use portfolios, there are numerous decisions to make, including the purpose of the portfolio, what it should contain, and how to assess it. A portfolio can serve to determine placement, document progress, or evaluate performance, and each purpose raises numerous issues for consideration (17,19,20–23).
We describe here the process of developing a well-structured portfolio for use in a psychiatry residency program, discussing the numerous portfolio-related decisions that we made to illustrate the process. The authors chose a definition and guidelines set by Reckase (24), who defined a portfolio as
a purposeful collection of student work that exhibits to the student (and/or others) the student's efforts, progress, or achievement in (a) given area(s). This collection must include student participation in selection of portfolio content; the criteria for selection; the criteria for judging merit; and evidence of student reflection. (p. 12)
Initial Development of Portfolio Elements
To initiate this innovative resident evaluation technique, the residency program director (Clardy) and an educational consultant (O'Sullivan) formed a committee to address the issues involved in using portfolios, including the type of portfolio, what to include, and criteria for judging merit (scoring). The committee was made up of seven faculty members, a resident, Dr. O'Sullivan, and Dr. Reckase as consultant to the project. The committee met in two consecutive half-day workshops.
The portfolio is a form of evaluation that is grounded in the actual practice setting and that trained raters can score objectively. Residents can use portfolios to show progress, to provide evidence for transition from one stage of education to another, or for evaluation of their performance. The committee chose a showcase portfolio, representing each resident's selection of his or her best work, for evaluating competency in specific areas. The "purposeful collection" that Reckase requires was therefore a collection of best work.
To determine the necessary content of the portfolio, we examined the "Objectives of Training" listed in the Program Requirements for Residency Education in Psychiatry (25) and individual clinical rotation objectives. The committee formulated a list of 13 topics that encompassed the Residency Review Committee's objectives. The topics selected were the following:
Residents select cases or experiences that illustrate best work in each area. A portfolio entry would contain all of the necessary documentation to demonstrate the resident's ability in the topic chosen and a cover letter explaining the case. (The question of how many topics a resident should complete per year is addressed at the end of the Discussion section.)
For each topic, the committee developed a definition, a description of documentation to include in the portfolio entry, and resources residents might consider. In some cases, we generated examples. These guidelines served as the starting point for a resident to develop an entry. In portfolio terminology, this guideline for each specific area is called "Criteria for Selection." Colloquially, we might call it "what you need to include."
Criteria for Judging Merit
We chose a holistic scoring approach to evaluate a portfolio entry. Holistic ratings require expert judgment to analyze the rather complex tasks involved in a performance such as a portfolio entry (26). The raters look at each entry overall rather than analyzing constituent skills or tasks, which would require assigning points to specific components. For holistic scoring, our committee developed a general scoring rubric, the specific set of rules used to score an open-ended task such as a portfolio entry. We chose a rubric focused on competency. The rubric defined resident competency in six levels:
The group tailored this scoring rubric to fit each of the 13 topics. For each topic, we developed a single sheet that defined the topic, gave guidance for what to include, listed the scoring rubric, and provided an example of a situation that could be used to generate an entry (see sample in Appendix A).
Evidence of Learner Reflection
A key component of a portfolio as described by Reckase is reflection by the person selecting and presenting the entry. Residents begin each entry with a cover letter. This letter demonstrates self-reflection and guides the rater. Residents needed to understand that this letter shapes the rater's impression of their competency. A good letter summarizes the case, points out elements of complexity in the case, and notes how handling of the case may have gone beyond conventional efforts. The letter is the resident's self-assessment of how the entry illustrates a given level of competence.
Dr. Reckase provided training in a one-day session. The six participants receiving the training were two psychiatrists external to the department and one from the department, the residency program director, a resident, and an educator (P.S.O.). Raters carefully reviewed the scoring rubric to become comfortable using all of its levels. Ultimately, the two psychiatrists external to the department were designated to score the portfolios turned in by our residents, since they would be unbiased, knowing neither the residents nor the cases.
Training can center on benchmark examples. Benchmark entries demonstrate what the portfolio entry should look like to achieve a specific level of competency. Ideally, each evaluation level of the scoring rubric for each topic should have a benchmark for training raters. Alternatively, entries generated by residents can be used to train raters. We used a mixture of benchmarks selected by faculty and entries generated by volunteer residents.
The rater trainees reviewed benchmark cases starting at the third level, Competency. The group then examined a case reflecting the highest level, followed by examples between the third and sixth levels. With that background, the group began rating a sample of entries solicited from resident volunteers. For each entry, patient and resident identities had been obscured, and the raters were also blind to the resident's year of training. Group members rated each entry independently and then explained their ratings to the group. As the raters became more comfortable with the process, group members rated different entries on the same topic prior to group discussion. Dr. Reckase continued training until two raters external to the department and one rater from the department could score entries consistently. Rating an entry takes approximately 20 minutes.
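Training continues until raters score consistently; a common way to quantify consistency between two raters on an ordinal rubric is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch follows; the six-level scores are hypothetical illustrations, not data from this study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same entries."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Proportion of entries on which the raters gave identical scores
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's score distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical scores on the six-level rubric for ten portfolio entries
rater_1 = [3, 4, 3, 5, 6, 3, 4, 2, 5, 3]
rater_2 = [3, 4, 4, 5, 6, 3, 4, 2, 5, 4]
print(round(cohens_kappa(rater_1, rater_2), 2))  # → 0.74
```

Values above roughly 0.6 are conventionally read as substantial agreement, so a program could use a threshold of this kind to decide when rater training is sufficient.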
The efforts described in this paper are necessary prior to implementation of portfolios within the residency program. Residents need the developed guidelines and the benchmark examples so that they will know what a portfolio entry looks like and how it will be evaluated. Developing these materials took considerable time on the part of the residency program, but the time expenditure was worthwhile for several reasons. First, faculty members seriously examined what was occurring in rotations to determine what the necessary psychiatric skills were and how residents demonstrate them. Second, experience with the first set of portfolio entries used in training scorers gave the faculty members insight into how portfolios could work in the program. Third, the initial experience provided understanding of what evaluation data are available from portfolios.
For the faculty, defining the necessary psychiatric skills provided a forum for reviewing expectations from the Residency Review Committee as well as what is actually done in our rotations. Faculty members discussed the opportunity that portfolios provide to amplify discussion with residents about patients. The faculty's enthusiasm and regular contact with the learners are important to the success of portfolios (27). The faculty also reviewed where the education for these necessary skills was occurring within the program. If another program were to use the skills and accompanying guidelines developed in this project, its faculty would need to meet to discuss and review how the portfolio requirements relate to their program and their teaching. However, our development work would save another program considerable time.
Our initial experience with portfolios came from residents who volunteered to do entries for the purpose of training scorers. At first, residents had some apprehension about the time that was required to do entries. As the portfolios become more established as a part of resident evaluation, a resident can complete an entry during each clinical rotation. That way the time to complete the entry is a natural part of reflecting on the care provided during the rotation. Residents report that if they are thinking about the portfolio and collect the supporting documentation at the time of care, assembly of an entry is relatively easy. We placed all of the entry requirements on the resident web page so that residents could view them at various clinical sites where they were providing care and would know what documentation of their work to copy for the portfolio rather than having to retrieve it later. Also, the resident who was on the committee generated a grid for the residents indicating which rotations provided good opportunities for obtaining entries that would demonstrate competency for specific topics.
The portfolios involved the residents more actively in their own evaluation than do more traditional methods of faculty observation or audits. Residents had choices about how to demonstrate their competency. They reflected on the care they provided and had the opportunity to explain their decisions.
The program also benefits from portfolios. For instance, on the basis of several portfolio entries, the program faculty recognized that they should examine the teaching of biopsychosocial/spiritual formulation. Entries from residents who generally scored well, both on the usual written faculty evaluations and on other portfolio entries, were judged below competent in this specific area. As in an item analysis, where an item writer would consider revising an item under such circumstances, we concluded that either the guidelines were flawed or the teaching of this area needed review. Subsequent discussion with the residents indicated that they considered their performance competent. This led the faculty to assess the teaching, and as a result several faculty members are increasing their focus on biopsychosocial/spiritual formulation during their rotations. Other programs could anticipate similar outcomes that would help improve their residency programs.
Portfolios also can be an effective part of general competency evaluation as mandated by the latest ACGME requirements. Our limited experience suggests that portfolios can help in at least two ways. First, some of the specific skills we identified overlap with general competencies. For example, our skill of professional communication overlaps with the ACGME competency of communication. Second, entries can provide evidence for general competencies beyond the specific skill. For instance, a crisis intervention entry also may provide evidence for competency in communication.
There are still issues that a program must address if implementing portfolios. On the basis of psychometric research, we suggest that residents complete five topics per year to make up their portfolio (28). We suggest that the program director or committee select one of the 13 topics and require that residents complete an entry on this topic during each year of training. This gives the program one skill to "anchor" ratings from year to year. For instance, we chose biopsychosocial/spiritual formulation to be done each year by each resident. The resident is free to choose the other four entries for the portfolio each year. The program may require that residents cover all 13 specific topics by the end of the four years of residency.
Drs. Andy Powell and Jay Rankin provided valuable insights while serving as portfolio raters. Val Shue provided editing expertise. The Edward J. Stemmler MD Medical Education Research Fund of the National Board of Medical Examiners provided support for this project through Grant 60-9899 ("Demonstration of a Portfolio Assessment in Residency Education," P. S. O'Sullivan, Principal Investigator). This work was presented at the Association for Academic Psychiatry Annual Meeting, October 6, 2000, Vancouver, BC, Canada.