To the Editor: Program directors and others with administrative roles in psychiatric residency programs know well how we do our best to keep on top of all the requirements and mandates issued by the Accreditation Council for Graduate Medical Education (ACGME). We work diligently to keep the Program Information Form (PIF) updated and to complete evaluations, surveys, and final resident reports. Then, of course, come the competencies. Change can be a difficult thing to digest, but we do what we can. We conformed to the six competencies and found new and innovative ways to monitor them in our residents and programs.
One ACGME mandate calls for a "program evaluation and improvement plan." My initial reaction was "What does this mean?" Then I broke it down into two parts: first, we evaluate the program, and then, if there are concerns, we put a plan in writing to address them. Program directors are constantly making changes, small and not so small, to better the education, training, and learning environment for the residents. A change could be as small as adding a recent article to the reading assignments or as involved as revising a rotation schedule in response to concerns that arise during training. These changes are typically made in response to feedback from residents, staff, nurses, and teaching physicians. It would be very useful to document these program improvements in a way that is easily demonstrable to the ACGME during site visits.
Most programs examine program improvement and evaluation annually or semiannually. Often committees are formed specifically to consider what program changes should be made, adding yet another meeting to already busy schedules. And in practice, do we really make changes to our programs only once or twice a year? Generally, small changes are made continuously.
At Michigan State University, we developed a system to record programmatic changes as they occur, rather than trying to recall them at some distantly scheduled meeting. Recording a change involves documenting what was done and who was involved, as well as gathering feedback from the concerned parties on whether the change accomplished the desired goal. Thus, we created "C-PIE," one-page Continuous Program Improvement and Evaluation forms on which we record programmatic changes as they occur (sample form available from the authors upon request). We review completed forms and fill out new ones at our monthly residency education committee meetings. We use these forms to document not only the changes made during the course of the year, but also actions taken on the basis of program evaluation forms, faculty evaluation forms, and even resident evaluation forms. Most of these evaluation forms are anonymous, and in those circumstances we note on the C-PIE that the concern came from an anonymously completed evaluation. The C-PIE can also be discussed at faculty meetings, resident meetings, or any other meetings.
The great advantage of these forms is that they provide a continuous, current record of all programmatic changes since the last visit from the Residency Review Committee. The forms remind us of the changes made and help us monitor their effects. They are also easy to complete, versatile, practical, and more representative of what we as program directors actually do on an ongoing basis. In addition, the forms document everyone involved in a particular change, can be sent easily to anyone, and can be provided to Residency Review Committee site visitors as evidence of genuine program evaluation and improvement.
In summary, programmatic changes made to improve educational value or address problems can be documented at the time they occur. This approach not only reflects what actually happens but also simplifies tracking and recording, making it easy to demonstrate program evaluation and improvement during Residency Review Committee visits.
At the time of submission, the authors reported no competing interests.