Evidence-based medicine (EBM) “represents the integration of best research evidence with clinical expertise and the patient’s unique values and circumstances” (1). Its adoption as an important organizing principle in all areas of clinical medicine—while not without criticisms (2)—has profound implications for psychiatric postgraduate training. Although the recently revised requirements for residency education in psychiatry from the Royal College of Physicians and Surgeons of Canada and the Accreditation Council for Graduate Medical Education (ACGME) stipulate that trainees in Canada and the United States achieve competency in evidence-based practice, the method by which this goal can be achieved and its implications for training programs are not described. This article attempts to address this issue by considering the meaning of evidence-based competency in the context of psychiatry training, reviewing the available evidence that addresses how evidence-based competency can best be taught, and providing some examples of innovative EBM instruction and assessment methods. We draw a parallel to other aspects of professional development and invoke the concept of the hidden curriculum (3, 4) to help elucidate the necessary training environment.
Evidence-Based Psychiatry: A New Paradigm for Teaching and Learning
To achieve competency in evidence-based practice, psychiatry residents must learn the content and process of EBM. The content of evidence-based psychiatry (EBP) is the sum of the important outcomes of clinical research that should inform psychiatric practice. This database represents essential empirical knowledge about risk, diagnosis, treatment, and prognosis that is needed to practice psychiatry effectively. Although the quantitative nature of this information and its origin in health research differentiate it somewhat from the essential knowledge of other paradigms, its acquisition during training may for the most part involve similar learning behaviors (e.g., reading textbooks and articles; attending rounds and seminars). However, because this essential knowledge is continuously being modified by research advances, residents also need to learn the process of EBP, that is, they must master the tasks of formulating questions and finding answers from the research literature. Engendering this professional behavior of continuously updating and expanding one’s knowledge of the empirical basis for practice represents a profound challenge to psychiatric education.
For example, a senior resident assesses a female adult survivor of childhood sexual abuse who suffers from treatment-resistant major depression. The trainee is competent in evidence-based practice and could be expected to formulate this woman’s presentation by considering the quantitative risk posed by the experience of childhood sexual abuse for the later development of depression, the relative effectiveness of the available treatments for depression, whether treatment outcomes are modified by the experience of childhood sexual abuse, and the likelihood of full recovery given this history. This EBP knowledge sits alongside other important information, such as the meanings that the patient ascribes to her early experiences and her preferences among various treatment options. It would be impossible for any practitioner to know the best current evidence available to address all such issues with every patient, but the EBP trainee could be expected to efficiently and effectively search for the answers and apply them where necessary.
What Works in Teaching Evidence-Based Medicine?
An important question for psychiatry educators is how to effectively foster the attitudes, knowledge, skills, and behaviors necessary for evidence-based practice. By searching MEDLINE and the Cochrane Library (to July 2007) using the search terms “medical education,” “evidence based medicine,” or “critical appraisal,” we identified systematic reviews of the effectiveness of interventions designed to improve students’ ability to practice EBM. During this search, we restricted publication type to “review” and scanned reference lists of selected articles for additional titles. English-language articles were included if they were peer-reviewed and if they reported methods for identifying primary studies. We identified eight reviews (5–12) summarizing a total of 32 primary studies mostly among residents and medical students, including one involving residents in psychiatry (13). One additional review (14) that summarized qualitative data and used a qualitative methodology to analyze the results was excluded.
Several observations emerge from these studies. First, reviewers found serious methodological problems in many or most of the primary studies examined. These methodological challenges, summarized by Hatala and Guyatt (15), include difficulty establishing valid controls free of contamination and cointerventions, small sample sizes that were limited by the size of the medical school or postgraduate program, a low signal-to-noise ratio that reflected the small effect that educational interventions can have on learner behavior, and a scarcity of valid and reliable outcome measures. Particularly striking were the absence of examples where a process or outcome of care was reliably and validly measured as an outcome of the intervention and the heavy reliance on self-report measures of learner behavior.
The second observation is that educational interventions appear to have had a greater influence on student knowledge (an understanding of principles, demonstrated usually on a multiple-choice test) than on skills (an ability to apply knowledge by performing an EBM step) or behaviors (actually performing EBM in clinical practice). These findings are consistent with what is known about knowledge translation: imparting knowledge through training is necessary but usually insufficient to produce behavioral change (16). Opportunities to practice the new skill, coaching, and downstream evaluations of behaviors are also typically required (17), but have generally not been included in the educational interventions described to date.
Third, the outcomes that can be achieved at different training levels may differ. In one review, outcomes were stratified according to student level (10), and knowledge of EBM was found to be consistently improved among medical students but unaffected among residents. Noting that evaluation is known to act as a major determinant of learning, the authors comment that this result may be attributable to the more direct connection between learning and evaluation in undergraduate compared with postgraduate medical education. Another possibility is that residents may be less able to learn something that they may see as irrelevant if it goes against the clinical culture in which they operate. Put another way, interventions designed to teach EBM may have little chance of success if they do not fit well with the hidden curriculum—that is, the set of influences that function at the level of organizational structure and culture (4). This hidden curriculum, which will demonstrate for students the value (or lack thereof) of evidence-based practice by determining, for example, the extent to which evidence-based practice is modeled by supervisors and peers, likely plays as important a role in the acquisition of evidence-based practice as it does in other aspects of professionalism (3).
The relationship between educational interventions and the hidden curriculum may also help explain a fourth observation: integration of educational interventions into the daily clinical work of students is associated with better outcomes. In their review, Coomarasamy and Khan (6) found that teaching methods that were integrated into clinical practice (i.e., based on actual patients and clinical problems) were more often effective than those that were not (positive results in 12/12 versus 16/34 studies, respectively). This result is highly consistent with the broad educational finding that context is critical for learning (6), and leading EBM educators have emphasized the need to integrate teaching with the provision of clinical care to maximize its educational impact (18). It is also possible that the organizational characteristics that create hidden curricula determine the likelihood that a program will adopt integrated teaching methods. If this is true, the observed added impact of integrated interventions may not be as much the result of integration as of differences in structure and culture among programs that were present at the start. Only a study in which whole programs are randomized to integrated versus standalone interventions could clarify the relationships between these variables, but for now it may be enough for educators to acknowledge and make optimal use of both formal and hidden curricula.
Innovative Methods for Instruction in Evidence-Based Medicine
It has been observed that clinical questions occur frequently in the minds of students but are rarely followed by a search of the literature for an answer (19). A qualitative study (20) of internal medicine residents yielded a number of factors that influence whether residents find answers to their clinical questions, including having sufficient time, having a way of remembering and tracking questions after leaving the clinic, having a way of prioritizing questions, the microclimate in which the residents work, their personal initiative, access to the literature at point-of-care, the possession of skills to find answers to questions, and the global institutional culture. A number of innovative (albeit mostly untested) approaches to EBM education may help address these barriers by creating a more tightly integrated educational program and shifting the hidden curriculum toward a culture that fosters evidence-based psychiatry.
Richardson (21) described a model for considering how clinical instructors teach EBM, which may represent a useful starting point for faculty development. First, teachers model evidence-based practice themselves by being explicit when they ask questions, search for evidence, critically appraise the literature, and apply evidence to patient care. Second, teachers weave evidence into their clinical teaching by including the results of health research on risk, diagnosis, treatment, and prognosis in their discussions with students. Third, they teach the skills needed to practice EBM by guiding students in their own formulation of questions and their search for and appraisal of evidence and its application to patient care. The traditional supervision meeting between psychiatry resident and clinical supervisor, with its built-in opportunities for reflection on and follow-up of learning issues, is ideally suited for all three of these modes of EBM teaching.
One method for making better use of questions as they come up is to record them as an educational prescription that indicates the structured clinical question, the time and place at which the answer will be reviewed, and the expectations of the learner’s presentation (22). Another way of tracking clinical questions is to enter them into an electronic database. Two groups have published their experiences asking residents to generate electronic “learning portfolios” of such questions. Fung et al. (23) described the implementation of a system across four Canadian obstetrics and gynecology training programs that logs patient encounters and directs reflection on critical incidents of learning by asking users to enter clinical questions and their answers. Crowley et al. (24) described a database, implemented across two inpatient general medical wards, that residents used to track clinical questions and the critically appraised answers generated while on call. Users reported that the process of asking and answering questions altered patient management in 47% of encounters. This database was designed to allow searching by other users and therefore served an additional function as a clinical resource for house staff.
Although a number of evaluations of traditional journal clubs found no effect on the skills of attendees (5), including the only evaluation of an EBM curriculum among psychiatry residents (13), it is possible that the journal club experience may be made more effective if the process of evidence-based practice is more faithfully rendered. So-called "question-driven" journal clubs, in which members start with an actual patient encounter and proceed through the steps of asking, acquiring, appraising, applying, and assessing, may foster better learning than clubs in which papers are selected independent of a clinical problem at hand. Kallen et al. (25) described a variation on the traditional journal club format in which fellows and consultants met twice for every article that they considered and, in the second meeting, jointly wrote a letter to the editor with their methodological criticisms and suggestions for improvement. The group's success in publishing three letters arising from nine articles is noteworthy and may have represented an important source of motivation for participants.
Assessment of Evidence-Based Medicine Skills
A systematic review of instruments that evaluate the teaching of EBM (26) documents the growing awareness that a repertoire of valid and reliable instruments for measuring its effects is needed to ensure that EBM education is evidence-based. The authors of that review highlight two instruments, the Fresno test (27) and the Berlin questionnaire (28), for their relatively well-established validity and reliability. The web-based Fresno test starts with two clinical vignettes that suggest clinical uncertainty and asks students to demonstrate how they would formulate a clinical question, acquire the relevant evidence, appraise that evidence, and apply it. This test has the advantage of assessing more than the critical appraisal step, which has historically been the focus of outcome measures but may not be as important with the growing availability of high-quality synopses and other sources of preappraised evidence. The Fresno test has been shown to have high interrater reliability, internal consistency, and discriminant validity (27). The Berlin questionnaire is a 15-item multiple-choice test that measures EBM knowledge and skills and has been shown to have content, discriminant, and responsive validity (28). It has the advantage of being more easily scored than the Fresno test but asks students to demonstrate skills in fewer EBM steps. Both instruments represent ways for training programs to document EBM competency among individual trainees. Alternatively, one group has published the feasibility and results of an EBM objective structured clinical examination (OSCE) station (29). The student was asked by a standardized patient to provide evidence supporting a clinical decision that he or she had just made. The student was then instructed to formulate a clinical question, perform a brief literature search in real time, appraise a standardized article, and then present the bottom line to the patient.
This instrument is unique in providing a method of assessing communication of evidence to the patient, but lacks the demonstration of robust psychometric properties that would be necessary for its use as an evaluation method. An important next step for educators in psychiatry is to adapt these or similar instruments to the mental health context and establish their feasibility and psychometric properties among trainees in psychiatry.
The way in which health research is brought into clinical encounters with patients will evolve as information technology continues to transform the knowledge-to-practice pipeline. However, the relatively recent emphasis on the use of best evidence in guiding health care will likely remain an important feature of psychiatry for the foreseeable future (30). The question of how best to train residents of any specialty to become evidence-based physicians remains important and largely unanswered at this time. Still less is known about whether and how this task may pose unique challenges to psychiatry, although the near absence of studies in psychiatric contexts suggests an opportunity for more research in this area. What does seem clear from the available data is that teaching residents to practice EBM is not straightforward. To be incorporated into the behavior of students, evidence-based psychiatry likely needs to be embedded in the routine clinical work of residents. It therefore needs to be adopted as an expectation of clinical care, modeled and valued by supervisors, and regularly evaluated, in addition to being formally taught in seminars and journal clubs. Thus achieving evidence-based competency among faculty will be a necessary condition of achieving evidence-based competency among trainees, a goal that naturally poses its own formidable challenges. If producing evidence-based psychiatrists is the goal, postgraduate programs need to examine, integrate, and align their formal and hidden curricula to ensure a learning environment that fosters evidence-based practice.
The authors thank Karen Saperson, Allyn Walsh, Robert Zipursky, and the anonymous reviewers for their helpful comments.