Literature over the past decade has reported challenges to teaching and learning in medical education related to variability across clinical settings (1–3). Medical educators have struggled to achieve equivalent educational program quality across sites, not only by providing balanced educational experiences but also by striving for the best performance outcomes. In a recent study of an obstetrics and gynecology clerkship, different rotation sequences affected national subject examination outcomes (3). Researchers have recommended that medical educators assess variability factors in clinical clerkship programs when developing, implementing, or revising those programs, prior to assigning students to sites (4). Doing so would help ensure inter-site consistency, a recommended measurement standard for multi-site clerkship program evaluations (5).
Standardized examinations have been a core component of assessing medical students’ competence after they have completed clinical clerkships (6). Professional oversight bodies have challenged medical schools to undertake curricular reforms, including developing additional assessment procedures, that would contribute to measuring improved performance in medical school and later in graduates’ future clinical practice (7). Little is known about the effects of student training in different clerkship sites on educational outcome measures for psychiatry clerkships. Differences in experience between clerkship sites have been difficult to measure. A previous study introduced “inter-site consistency” as one measure of programmatic evaluation for multi-site internal medicine clerkships (5). Our study used the National Board of Medical Examiners (NBME) subject examination as an assessment tool for testing students’ knowledge acquired during a clinical psychiatry clerkship, based on studies of its reliability as an objective measure of student performance (8–10). Studies have shown conflicting results for the effect of site on the final exam score within various primary care and ob/gyn clerkships (11–20).
Our study addressed the following research question: Does the rotation site of our third-year psychiatry clerkship affect medical students’ scores on the National Board of Medical Examiners subject examination in psychiatry? We examined the relationship between examination performance and both inpatient and outpatient sites for a third-year psychiatry clerkship program at a southwestern university medical school.
We assessed the performances of three consecutive classes of third-year medical students from 2001 to 2004 on the NBME subject examination, administered on the final day of each 6-week psychiatry clerkship. All third-year medical students at the University of Oklahoma Health Sciences Center who completed the required psychiatry rotation between July 2001 and June 2004 were eligible for this study. The study was reviewed and approved by the Institutional Review Board of the University of Oklahoma Health Sciences Center, consistent with policies for publishing educational studies. Data included in the study were:
Students’ final grade in the second-year Human Behavior II course. This grade was based on two multiple-choice examinations, each counting for 50% of the final grade
The clinical site assignment of each student, consisting of one inpatient site and one outpatient site, each lasting 3 weeks during the 6-week course
Students’ final score on the NBME subject exam in psychiatry, which was administered at the end of each 6-week rotation
The final grade from the Human Behavior II course, which focused on psychiatric conditions and their treatments, was used to estimate students’ baseline knowledge. Students’ post-clerkship knowledge was measured using their scores on the NBME subject examination in psychiatry, which each student took on the last day of the 6-week rotation. The reliability and validity of this exam as an estimate of content knowledge have been established (8–10).
We reviewed the records of 439 students, first excluding 153 students for whom we did not have a reported NBME score. These were students who completed their third year of training in the clinical track at the Tulsa campus, whose scores were not available to us. The remaining 286 medical students were assigned 3 weeks at a psychiatric inpatient site and 3 weeks at a psychiatric outpatient site in Oklahoma City. Some medical students alternated among multiple part-time sites during the 3 weeks, spending 1 or 2 days a week at each site; for the purposes of this study, we limited ourselves to comparisons of full-time sites only. From the resulting group (N=245), we randomly assigned students to seven equally sized sample groups (n=35) for the sites. For this exploratory study, the sample size of 35 subjects per group was computed to detect a medium effect (d=0.5) in the difference of NBME raw scores across clerkship sites (two-tailed, power of 0.80, significance level of 0.05).
The seven site groups had homogeneous distributions of Human Behavior II scores. This was achieved through stratified randomization, allocating students in proportion to the defined population’s Human Behavior II score distribution across five percentile ranges: below the 20th percentile, above the 20th to the 40th, above the 40th to the 60th, above the 60th to the 80th, and above the 80th percentile. After random assignment, each site group contained 5, 7, 11, 7, and 5 students in the five ranges, respectively.
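The stratified assignment described above can be sketched in a short script. This is an illustrative reconstruction under stated assumptions, not the procedure the authors actually used: the function name, the fixed seed, and the round-robin allocation within strata are our own choices.

```python
import random
from statistics import quantiles

def stratified_assign(scores, n_groups=7, n_strata=5, seed=0):
    """Assign students to site groups so each group mirrors the overall
    distribution of prior-course scores (stratified randomization).

    scores: dict mapping student id -> Human Behavior II score.
    Returns a list of n_groups lists of student ids.
    """
    rng = random.Random(seed)
    # Cut points at the 20th/40th/60th/80th percentiles (4 cuts -> 5 strata).
    cuts = quantiles(scores.values(), n=n_strata)

    def stratum(score):
        # Number of cut points the score exceeds: 0..n_strata-1.
        return sum(score > c for c in cuts)

    strata = {k: [] for k in range(n_strata)}
    for student in scores:
        strata[stratum(scores[student])].append(student)

    # Shuffle within each stratum, then deal students round-robin into
    # groups so every group draws proportionally from every stratum.
    groups = [[] for _ in range(n_groups)]
    g = 0
    for k in range(n_strata):
        rng.shuffle(strata[k])
        for student in strata[k]:
            groups[g].append(student)
            g = (g + 1) % n_groups
    return groups
```

With 245 students and seven groups, this yields seven groups of 35, each drawing from every score stratum, matching the balanced design the study describes.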
The independent variables were inpatient site and outpatient site; the dependent variable was the NBME subject examination percentile score. After controlling for students’ prior knowledge of psychiatry, the randomly assigned site groups were compared on their NBME subject examination scores in psychiatry.
Site comparisons were conducted for the four inpatient sites and for the three outpatient sites by analysis of variance (ANOVA). Directional associations from the site factor (nominal data) to the NBME subject examination percentile scores (interval data) were analyzed with eta (η). Effect sizes for the analyses of variance were calculated as eta squared (η²).
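For a one-way ANOVA, eta squared is the between-group sum of squares divided by the total sum of squares, i.e., the proportion of score variance attributable to site. A minimal sketch of that computation follows; it is illustrative only, not the statistical software the authors used.

```python
from statistics import fmean

def one_way_anova_eta_sq(groups):
    """One-way ANOVA F statistic and eta squared for a list of score groups.

    groups: list of lists of exam scores, one inner list per site.
    Returns (F, eta_squared) where eta_squared = SS_between / SS_total.
    """
    all_scores = [x for g in groups for x in g]
    grand_mean = fmean(all_scores)

    # Total variability of scores around the grand mean.
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    # Variability of site means around the grand mean, weighted by group size.
    ss_between = sum(len(g) * (fmean(g) - grand_mean) ** 2 for g in groups)
    ss_within = ss_total - ss_between

    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    eta_sq = ss_between / ss_total
    return f_stat, eta_sq
```

An eta squared near zero, as implied by the nonsignificant site comparisons reported below, means site membership explains almost none of the variance in examination scores.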
Table 1 depicts the effects of inpatient clerkship site on NBME subject examination scores. There were no statistically significant differences among the four inpatient clerkship sites (one of which was community based) in NBME subject examination scores in psychiatry. Inpatient clerkship site did not affect medical students’ NBME performance.
Table 2 reports the effects of outpatient clerkship site on NBME examination scores. There was no statistically significant difference among the three outpatient clerkship sites (one of which was community based) in NBME subject examination scores in psychiatry. Outpatient clerkship site likewise did not affect medical students’ NBME performance.
In this study, the location of clerkship site did not affect scores on the NBME psychiatry subject examination. There were no significant differences between sites in NBME scores when Human Behavior II final grades were used as a control. Based on these data, psychiatry clerkship site did not affect student achievement as measured by the NBME subject examination in psychiatry. We concluded that our multiple-site clerkship program, across both inpatient and outpatient full-time sites, satisfied an inter-site consistency standard by producing convergent NBME subject examination scores in psychiatry.
Our results addressed recommendations of previous studies that medical educators should consider assessing variability factors in clinical clerkship programs while developing, implementing, or revising clerkship programs (4) in order to ensure inter-site consistency as a recommended measurement standard for multi-site clerkship program evaluation (5). We believe that this study’s design provides a replicable and feasible approach for other directors of academic programs to determine whether their current clerkship provides equivalent opportunities for learners to achieve on a national standardized test of psychiatric knowledge, regardless of multiple site variances.
The present study selected the NBME subject examination percentile score as the standardized student outcome measure because of its reliability and widely accepted use. Although the examination’s strengths include standardized assessment of diverse subject areas, it does not assess actual “hands-on” patient care skills. We did not attempt to compare these highly variable technical and interpersonal skills. We recommend that future studies explore measures of student outcome beyond NBME scores for assessing inter-site consistency.
Additional limitations of our study were that NBME scores for students on the smaller Tulsa campus were not included in the analysis, nor were those for students who were assigned to more than one outpatient site, resulting in a smaller sample size. This may have reduced the power of the study. The standard deviations of NBME scores were large, so small differences in performance may have been missed (21). Moreover, we did not examine other factors that could affect student performance, such as the timing of clerkships within the academic year or specific site supervisors. Finally, we cannot rule out that the required weekly teaching conference and the standardized weekly afternoon lecture series covering psychopathology, psychotherapy, and biological treatments may have accounted for similar NBME performance across clinical sites.
Multiple-site clinical education has expanded to community-based sites beyond teaching hospitals, making assessment of inter-site consistency an even more valuable clinical educational standard. This standard is a helpful tool for medical educators in diverse disciplines to review their current curriculum and to provide valuable feedback to clerkship site directors and site supervisors relevant to faculty development.
It is an important educational goal to ensure that a uniformly high quality of medical student psychiatric education is achieved in both the academic and community environments. Our study describes a method of continuously assessing inter-site consistency using NBME subject examination scores, using an approach that can be easily adapted to other clerkship programs.