One of the goals of residency education is to prepare physicians for the independent practice of clinical medicine. This schooling is delivered through a combination of didactics and clinical experience. Psychiatric residency education requires residents to demonstrate competence in acquiring medical knowledge, one of the six core competencies outlined by the Accreditation Council for Graduate Medical Education (ACGME) (1). The most effective methods of providing resident instruction are not well established. Our residency training program is interested in developing a formal curriculum that enriches residents’ general knowledge in psychiatry, guided by the annual PRITE examination. This standardized test has been found to be a “moderate predictor” of performance on the American Board of Psychiatry and Neurology (ABPN) Part 1 examination in psychiatry (2). Webb et al. (2) reported a national correlation of 0.67 (p<0.01) between PRITE scores and subsequent performance on the ABPN Part 1 written examination, which they characterized as a “fair correlation.” Smeltzer and Jones (3) examined the validity of the PRITE. They noted that content validity (the degree to which the items on the PRITE are accurate and reasonably representative of the knowledge that must be mastered during a psychiatric residency) was “appropriately evaluated.” The global score on each of the PRITEs evaluated had reliability comparable to that of standard certification examinations used in medical education: on each test these authors examined from 1980 through 1987, the Kuder-Richardson reliability coefficient for the global score in psychiatry was above 0.90, which is adequate reliability for making comparisons among individual examinees (3).
To our knowledge, most residents prepare individually for the PRITE by simply reviewing prior exams. We could find no published data on interventions for psychiatric residency programs specifically; a PubMed literature search using the keywords “psychiatry,” “training programs,” “residency,” “improving general psychiatry knowledge,” and “improving PRITE scores” did not reveal any such studies. Neurology and surgical residency training programs, on the other hand, have tested program interventions to improve both general knowledge in those fields and performance on their yearly in-training examinations, the Neurology RITE (4) and the American Board of Surgery In-Training Exam (ABSITE) (5), respectively. For example, Schuh et al. (4) describe one neurology residency training program that initiated an educational intervention: residents prepared weekly presentations and, divided into two teams, participated in weekly oral quizzes in a game show format. In that study, the control group consisted of residents who attended faculty-prepared didactic lectures; the intervention group showed statistically significantly higher subset percent-correct RITE scores and mean yearly percent-correct change.
This pilot project—a resident-developed focused educational program—was intended to enhance general psychiatry knowledge as measured by PRITE scores, and to this end we conducted a qualitative and quantitative program evaluation.
We designed a once-a-week course, called the Knowledge Base Review (KBR), composed of nine sessions that led up to the fall PRITE; participation was mandatory. The American College of Psychiatrists designed the PRITE as an educational resource for psychiatric residents and training programs. Each section of the exam focuses on a particular component of psychiatry, offering references to support and explain correct answers. Residents receive a detailed computer analysis of their test performance in comparison with other residents at a similar level of training. Training directors receive results for their individual residents as well as statistical summary data comparing their training program with other groups of participants. Created in 1978, the PRITE consists of 300 questions and is administered in two parts over 2 days; residents at all postgraduate year (PGY) levels take the same examination together. To protect confidentiality, scores are returned to the program director, who then gives the test results individually and privately to each resident. The content areas covered in the PRITE are neurology and neurosciences, growth and development, adult psychopathology, emergency psychiatry, behavioral science and social psychiatry, psychosocial therapies, patient evaluation and treatment selection, consultation-liaison psychiatry, somatic treatment methods, child psychiatry, alcoholism and substance abuse, geriatric psychiatry, forensic psychiatry, and miscellaneous. Fourteen areas are scored: one for neurology and 13 for psychiatry (12 topic-specific plus a “miscellaneous” section). Scores are reported for all 14 areas, along with a global psychiatry score.
Each session was 2 hours long and was divided equally between didactic and active learning. The first hour spotlighted one of the 12 psychiatry subscales of the PRITE with a slideshow lecture prepared and presented by a senior PGY-4 psychiatric resident, drawing on a variety of sources: psychiatry textbooks, psychiatry board review books, and explanations of questions from past PRITE exams. Lecture materials were also provided as hard copies to each resident, and we encouraged interactive group discussion during this peer-presented didactic. Since the time frame available for the course was limited to 9 weeks, the KBR focused on nine of the 12 PRITE subscales that showed room for improvement based on results from the previous year (2007). The three psychiatry subscales not covered in the curriculum were somatic treatment methods, child psychiatry, and alcoholism and substance abuse, because these scores were areas of strength for our training program. Neurology was not addressed except incidentally during the second-hour game show questions. This first hour was innovative in that, unlike other published residency in-training exam review initiatives, it was not based on quizzes or on residents studying textbooks.
The second hour, in contrast, utilized a video-projected question-and-answer test that paralleled the format, balance, and subject content of the psychiatry written board exams (7). The game show test was projected, via a laptop computer connected to a video projector, onto a large canvas screen covering one wall of the room. Residents were divided into four teams of about seven residents each, balanced across levels of training. Each group chose a team name and actively competed against the others, as if on Family Feud or Jeopardy! During this hour, some neurology-based questions (e.g., questions about depression in multiple sclerosis or Alzheimer’s dementia) appeared randomly on the projection. After a question was displayed, each team convened and collectively selected an answer. An explanation of the correct response then appeared on the projection, followed by a group discussion that expanded on the topic. Each team accumulated points for correct answers and lost points for incorrect ones. The residents kept scores from one session to the next as the teams competed over the course of the 9 weeks.
Midway through the KBR course, we collected evaluation forms from all of the residents who participated. We were interested in whether the course was meeting its own objectives, whether it was meeting the residents’ expectations, whether it was well presented, and whether the didactic and game show hours were useful components.
Moreover, data from the year prior to the KBR (2007) were compared to data post-KBR (2008) for 23 residents who took the PRITE in both years. Because of the possibility of year-to-year variability in changes in scores due to nonspecific factors, we analyzed both the global psychiatry scores (which were addressed by the KBR curriculum) and global neurology scores (which were not addressed in the KBR). Our primary hypothesis was that the psychiatry global scores would increase significantly from 2007 to 2008. We secondarily analyzed neurology global scores to identify any possible secular drift in the scores between 2007 and 2008.
Quantitative data analysis, based on the residents who took the PRITE both the year before and the year after the KBR (N=23 pairs of scores), began by examining the distributions of raw scores on the PRITE global subtotals (the global psychiatry and global neurology scores). Analyses were conducted using SPSS (version 11.5, 2002). Since data distributions appeared normal, parametric statistics were used, including a one-tailed paired t test for psychiatry global scores (the predicted change was upward, since the KBR addressed some but not all subscales) and a two-tailed paired t test for neurology global scores (there was no prior directional hypothesis, since the KBR did not address this content). Data were de-identified by the program director prior to analysis and are reported as percentage change rather than raw scores to avoid the raw scores being taken out of context. Because this project was undertaken as a quality improvement rather than a research effort, informed consent from individual residents was not obtained. The project was approved by the institutional review board.
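The paired analysis described above can be sketched as follows. This is an illustrative sketch only, using synthetic scores and scipy rather than the SPSS package the program actually used; the score distributions and shifts are assumed values, not the PRITE data.

```python
# Illustrative paired t-test analysis (synthetic data, not actual PRITE scores)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 23  # residents who took the PRITE in both years

# Hypothetical paired global scores for 2007 (pre-KBR) and 2008 (post-KBR);
# the means, spreads, and shifts below are invented for illustration
psych_2007 = rng.normal(180, 15, n)
psych_2008 = psych_2007 + rng.normal(4, 10, n)   # assumed slight upward shift
neuro_2007 = rng.normal(60, 8, n)
neuro_2008 = neuro_2007 + rng.normal(-5, 6, n)   # assumed decline

# One-tailed paired t test for psychiatry (directional hypothesis: increase)
t_psych, p_psych = stats.ttest_rel(psych_2008, psych_2007,
                                   alternative='greater')

# Two-tailed paired t test for neurology (no prior directional hypothesis)
t_neuro, p_neuro = stats.ttest_rel(neuro_2008, neuro_2007)

# Report percentage change rather than raw scores
pct_psych = 100 * (psych_2008.mean() - psych_2007.mean()) / psych_2007.mean()
pct_neuro = 100 * (neuro_2008.mean() - neuro_2007.mean()) / neuro_2007.mean()
print(f"Psychiatry: {pct_psych:+.1f}% change, t={t_psych:.2f}, p={p_psych:.3f}")
print(f"Neurology:  {pct_neuro:+.1f}% change, t={t_neuro:.2f}, p={p_neuro:.3f}")
```

With df=22 (N-1 pairs), each test compares the within-resident score differences against zero, which matches the df reported in the results.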
Participation and Acceptability
Attendance for the course was required, and participation was high. Participation could not reach 100%, however, because of scheduled vacations, sick leave, paternity and maternity leave, and required postcall time off (an ACGME duty-hour requirement). Our resident survey indicated good to excellent participation and acceptability of the KBR course. Attendance during the 9 weeks among the PGY-2, -3, and -4 classes was 82%, 73%, and 94%, respectively. Eighteen of 28 residents (64%) completed a survey midway through the course. Sixty-one percent of the residents felt that the course met its own objectives “all of the time,” while 95% responded that the KBR course was meeting their own expectations “all” or “most of the time.” Seventy-two percent “strongly agreed” that the course was presented in an easy-to-follow and understandable manner. One hundred percent responded that the didactic component was useful, and 94% felt that the game show component was useful (Table 1).
Impact on Medical Knowledge
Our primary hypothesis was that global psychiatry scores would increase significantly between 2007 (pre-KBR) and 2008 (post-KBR) for residents who took the PRITE at both times. We analyzed the global neurology scores to account for year-to-year variability due to nonspecific (i.e., non-KBR) factors, since neurology was not included in the KBR curriculum. Statistical analysis revealed a nonsignificant 2.6% increase in global psychiatry scores (paired t=1.5, df=22, p=0.15), in contrast to a 9.0% decline in global neurology scores (paired t=3.9, df=22, p=0.001). Among residents tested in both years, individual psychiatry subscales, some of which were covered by the KBR, did not change consistently in one direction or the other (data available upon request).
This type of resident-developed focused program was well accepted by the residents. An educational intervention in a team-oriented gaming format that emphasized resident performance in front of peers may be useful to residency programs with a strong desire to impact basic and clinical sciences study and learning. For example, a game show format was as effective as standard lectures in teaching medical students about ectopic pregnancy, but rated higher in stimulating faculty/student interaction, helping retain information, and in overall enjoyment (6).
While the KBR course was not designed specifically to address PRITE items, and it did not cover all psychiatry subscales, we considered this standardized national test of medical knowledge a reasonable tool with which to gauge the impact of the course. The quantitative data suggest some impact of the KBR, since global psychiatry scores appeared to improve despite a concomitant decline in the global neurology scores, which were not addressed by the KBR didactic sessions. It is possible, though unlikely, that the KBR offset a deteriorating program-wide trend in teaching between 2007 and 2008, or that the KBR had no significant effect while the neurology teaching deteriorated. It is also possible that the decline in neurology scores was a function of year-to-year variation that may also have affected psychiatry scores. These possibilities cannot be differentiated without following trends over multiple years. Interestingly, in 2008 the program placed in the top ten programs nationally in the American Psychiatric Association’s Mind Games competition, which tests a similar breadth of knowledge. Whether the KBR course contributed to this achievement is also unclear.
Nonetheless, the KBR format was feasible to implement with minimal resources and was well accepted by the residents. This pilot experience could be further developed to achieve more impact on the outcome variable of interest, PRITE scores. A more intensive or broader course might have produced more robust improvement; for example, if the KBR curriculum were longer in duration and encompassed all 12 subscales, the global score results might have been more robust and significant. The Schuh et al. intervention (4) did demonstrate a significant change in their neurology residency program’s scores; notably, their course was “20–22 weeks long,” while ours was 9 weeks. It would also be possible to try to improve PRITE scores by linking the KBR course content more closely with the PRITE items; one possibility is to review and teach from previous PRITE exams for the full 2 hours (i.e., to teach to the test). However, this was not the intent of the course, which was to enrich residents’ general knowledge in psychiatry as guided by the annual PRITE examination.
We asked for feedback and suggestions from the residents for areas for future improvements in the KBR course. The responses included using old PRITE exams as a pre- and postcurriculum measure to gauge areas that improved and those that still need improvement. Residents also requested incorporating clinical case vignette discussions into the didactic lecture.
In summary, the broad participation in and acceptability of the Knowledge Base Review course demonstrate the potential for such a resident-organized and -led intervention to enhance acquisition of medical knowledge through an enjoyable and effective approach.
The authors appreciate the statistical consultation of John C. Simpson, Ph.D. At the time of submission, the authors reported no competing interests.