“…The clinician who becomes adept at scientific clinical examination and reasoning may no longer need to preserve irrational barricades of the intellect to preserve his old ‘art’ against apparent assaults from the new ‘science’ …one of the main purposes of improving science in the clinical examination…is to separate the false art from the true.” —Alvan R. Feinstein (1)
Evidence-Based Medicine in the Residency Curriculum: Worth the Trouble?
Evidence-based medicine (EBM) has been described as a “paradigm shift” in the practice of medicine. A seminal 1992 paper (2) described the traditional paradigm as relying excessively on unsystematic clinical observation, pathophysiologic theory, and content expertise. This was contrasted with the proposed evidence-based paradigm of clinical practice. The latter advises caution in the absence of systematic study, the necessity but insufficiency of pathophysiologic theory, and the need to understand certain “rules of evidence” in applying clinical literature to patient care. Goldner and Bilsker (3) reminded us of the antiquity of this distinction and advocated a similar shift for clinical psychiatry toward a more balanced emphasis on techne, or “abstract laws governing types of patients,” in an area previously dominated by phronesis, or “clinical judgment in a particular case.”
Several prominent voices have called for a similar move in residency training. Evidence-based medicine has been proposed as a “core principle” to guide the reform (4) and assessment (5) of residency education. Indeed, a component of the core clinical skills required by the Accreditation Council for Graduate Medical Education (ACGME) (Table 1
) closely parallels the steps of clinical decision-making articulated in EBM manuals (Table 2).
A reasonable assumption supporting such proposals is that teaching EBM skills will facilitate the translation of emerging clinical research into practice (6) and thereby improve clinical outcomes. However, this claim is difficult to substantiate empirically, and EBM-guided approaches to practice have been met with controversy. An exhaustive review of such criticisms (7) includes concerns that this approach tends to devalue clinical expertise, ignores the complex realities of patients’ presentations, and promotes a “cookbook” approach to practice. Critics in psychiatry have also pointed to the inadequacy of the DSM-IV diagnostic system, the problem of unpublished evidence, and the lack of representativeness of study participants (8, 9). There is a dual irony in this very old controversy. First, the prominence of techne in the explicit language of EBM has sparked useful criticism of simplified, data-driven approaches to practice and, in turn, a clearer articulation of phronesis. Second, attacks on so-called “evidence-based” recommendations often articulate sound concerns about the validity, size, precision, and generalizability of published evidence: precisely the concepts that EBM is designed to teach. Meanwhile, the rapid accumulation of published evidence only raises the stakes for those wishing to make the best possible clinical decisions.
This article takes this reality as a point of departure: there is a need for more, not less, pedagogic effort to enable trainees to understand the terms of this debate, wade through the clinical literature, and learn to make independent clinical decisions. After all, most clinicians would agree that keeping up with the published evidence is simply a part of sound clinical practice. The question taken up here is not whether but how to envision training toward such self-directed education within a residency curriculum.
The challenges of implementing EBM in psychiatric education arise from pragmatic and conceptual barriers. In the first category are commonly reported factors such as a lack of time and appraisal skills (10) that limit the opportunities for residents to acquire competence in the steps of EBM. Several creative classroom and bedside teaching strategies (11–13) and case studies modeling “real-time” decision making in psychiatry (14–16) have been published to address this abiding challenge for busy clinicians.
In contrast to the attention paid to these important barriers, conceptual confusions have been neglected and will be the focus of what follows. These confusions can derail curricular efforts. For example, extended EBM-inspired classroom discussions about meta-analyses or confidence intervals can unwittingly convey the impression that nonmeasurable aspects of clinical work are best relegated to a mystical and, by implication, less respectable realm. Educators in EBM have also described trainees facing an uncomfortable dissonance between the gratifying certainties offered by the traditional paradigm and the probabilistic, provisional answers provided by the evidence-based paradigm (17). Although the EBM model advocates “integrating the best evidence with clinical expertise and patients’ values” (18), the actual practice of techne and phronesis can seem worlds apart. Trainees are then vulnerable to a false choice between two caricatures of practice. At one extreme is the “intuitive” practitioner, impervious to the fuss over evidence; at the other, the “highly numerate medical [academic, who] belittles the performance of experienced clinicians using a combination of epidemiological jargon and statistical sleight-of-hand” (19).
Curricular efforts would thus benefit from an explicit account of where EBM skills fit within broader educational goals. An inclusive notion of clinical expertise is offered here as one such goal. The project of moving toward this goal can release trainees from misguided battles between techne and phronesis. Both can then be incorporated in a model of reflective clinical practice that acknowledges uncertainty yet facilitates decisions made with the best available evidence. What follows is an attempt to step back and recover a workable notion of clinical expertise for psychiatry; to situate EBM as a vital, albeit limited, part of this notion; and to invite the reader to reflect on the implications for residency education and continuing professional development.
Two clarifications are necessary before proceeding. First, EBM is used here in its original sense: a systematic approach to clinical uncertainty for the individual clinician who wishes to apply the best available evidence to everyday clinical acts such as diagnosis, prognosis, etiologic speculation, or treatment (Table 2
). Teaching EBM thus supports, but is distinct from, the work of disseminating empirically supported interventions or “evidence-based practices” in a health care system. The latter is better discussed within a framework of change management or quality improvement (20). Second, this article is intended as a conceptual background for, rather than a detailed description of, a curriculum.
Clinical Expertise: Practicing the Tasks of Empathy, Interpretation, and Science
What is clinical expertise? Scholars have offered several distinct notions of professional expertise that highlight elements such as technical skill, the ability to apply general concepts, and the ability to reflect on and take a critical stance toward one’s own practice (21). Others have warned of the risk of paying too much attention to the individual and cognitive aspects of clinical expertise while ignoring factors in the clinical environment—such as the availability of experienced collaborators—that can aid or limit effectiveness (22). Certainly, the project of clarifying what makes for “expert” practice is far from complete, as reflected in clinical legends of masterful diagnoses or interventions that defy educational recipes. In this light, what follows is a limited, workaday conception of clinical expertise that should be judged by its ability to contextualize EBM within psychiatric training.
Frank Fish (23) articulated three conceptually distinct psychologies, interpreted here as tasks, faced by psychiatrists in caring for patients. His own words are offered to introduce the tasks of empathic, interpretive, and scientific psychology.
“By means of introspective knowledge of our own behavior and practical experience of the behavior of others, we develop a special body of psychological knowledge which can be called Empathic Psychology” (23).
Psychological knowledge derives from the first task of the clinician-trainee: to actively appreciate the idiosyncratic connections among thoughts, feelings, and actions that constitute another person’s mental universe. We will remain puzzled by a patient’s behavior (e.g., a man who does not adhere to an antipsychotic that previously relieved distressing hallucinations) without eliciting how the patient came to this decision (e.g., his fear that the medicine will make him less vigilant toward a neighbor who he is convinced is spying on him). Supervised exposure to a variety of patients can strengthen this empathic acumen. However, we are quickly called to its limits: even the most empathic listeners can only struggle to grasp experiences that are qualitatively alien to them (e.g., psychotic hallucinations, melancholic depression, or primary delusions). Descriptive psychopathology, the published account of the varieties of experience in mental illness (24), is thus essential for the trainee to advance in this task, but clinicians are still limited by what the patient censors (consciously or not) from the dialogue. Clinician-trainees can proceed by applying themselves to the next “task.”
“Interpretive Psychology … [is that in] which the ideas which have been obtained by empathizing with the patient are formulated in terms of some general theory which has been derived from neurophysiology, neurology, philosophy or mythology” (23).
It is not enough to elicit a detailed description of a patient’s subjective landscape. In order to act, the clinician must sift through this rich and diverse database with the help of organizing theories. This is the second, or interpretive, task. Leston Havens (25) described four interpretive models, each with a distinct view of what constitutes relevant data. This notion of having several “perspectives” available has also been elucidated by McHugh and Slavney (26). Trainees can learn to apply distinct theoretical points of view, to selectively highlight clinical data, and to formulate discrete interventions. For example, an empathic approach (the first task) can elicit a rich account of a patient’s fears and experiences, which can be sifted (the second task) for evidence of a disease, such as a chronic psychotic disorder, with its characteristic symptoms, course, and exacerbating factors. Additionally, a second theoretical perspective that focuses on the patient’s fantasies and goals in the context of his or her narrative, or “life story” (26), can be invaluable in engaging the patient in psychotherapy. Trainees can also attempt to locate the patient’s personality traits along a continuum so as to better understand coping responses to particular stressors.
This pluralistic application of multiple interpretive models is championed by Ghaemi (27), and its proposals are worth examining. First, no one perspective or metaphor, be it neurobiological, behavioral, or existential, can claim a complete account of the human predicaments that present for clinical attention. From this follows a need for the trainee to become familiar with several approaches. Second, this pluralism discourages an “anything goes” eclecticism, asserting that in most cases one perspective will be more relevant and powerful than another. For example, a behavioral perspective might offer more leverage than a psychodynamic perspective in the initial approach to an active alcohol addiction. The trainee must therefore prioritize which theoretical approach is most relevant to the particular predicament at hand. Third, there must be explicit awareness of the particular strengths, weaknesses, and predictions of whichever theory is being applied. This alerts the trainee to appraise the empirical support for theoretically driven clinical choices and to monitor outcomes. Such an appraisal is the substance of the third task.
“Empathic and interpretive psychology must be clearly distinguished from scientific psychology … which investigates animal and human behavior in a scientific way and establishes rules and laws” (23).
The unique nature of the scientific task is best illustrated with the concepts of Understanding (Verstehen) and Explaining (Erklären), first translated for psychiatry by Karl Jaspers (28). The work of Understanding is the clinician’s task of comprehending the unique subjectivity of the individual patient. It rests on the premise that “behavior means something, that is, it arises with internal consistency from psychological events” (24). It can be made operational for trainees through teaching and guided practice in the first two tasks. Explanation, in contrast, seeks a vantage point outside the patient’s subjective world and attempts to evaluate causal connections relevant to clinical phenomena.
For example, one young man in our clinic, with no pre-existing psychotic disorder, insisted on continued daily use of marijuana for its “calming” effects. This was a source of considerable distress to his parents, who had read media reports of a possible risk for psychosis and asked for a clinical opinion about his personal risk. Although two events might be associated—in this case, the use of marijuana and the onset of psychotic symptoms—the judgment of whether one causes the other cannot be based solely on an empathic interpretation of the patient’s experiences. We need here to engage in “scientific psychology,” or the third task.
The trainee can be supervised through a systematic approach to this question (presented in Table 2
). A detailed account of this process has been described elsewhere by me and other authors (14, 15), and each step presents opportunities to teach distinct knowledge and skills. For example, the second step would include knowledge of “secondary,” or preappraised, sources, such as the journals Evidence-Based Mental Health or Clinical Evidence, and the use of search strategies that attempt to maximize rigor while minimizing the time spent finding relevant information (15). For the third step, the trainee could be guided to fill out the appropriate appraisal worksheet from an EBM manual (29) (Table 3
) or refer to a secondary journal, such as Evidence-Based Mental Health, where the article of interest has been appraised in a brief, structured format (30). The trainee is urged to focus on three questions: how valid, how important, and how applicable are the results? The subsequent steps four and five are summarized in Table 2
. This process often leads to further questions that can be prioritized (31) based on clinical urgency, ubiquity, or interest and cycled through the same steps. This can add to both the trainees’ knowledge and their confidence in seeking out answers.
Multitasking and the Role of EBM
Evidence-based medicine is a set of tools that offers clinicians an explicit, systematic entry into their third, or scientific, task of evaluating causal inferences relevant to such everyday clinical acts as comprehending and communicating prognosis, comparing treatments, and evaluating risk. Although there are few compelling explanations for the etiology of most psychiatric disorders, we can say more about their course, prognosis, and response to treatment. Much of this is derived from clinical studies of groups of patients. The data from these studies, however valid and relevant, must be weighed against individual variability and the particular values, beliefs, and desires of individual patients. Clinical expertise operates in this dialectic between efforts at explanation and understanding (Figure 1).
The distance between the particularities of an individual case and data derived from studies of groups of cases is often difficult to traverse and has little to do with EBM itself. Clinical knowledge rarely emerges from one theoretical insight or study but involves a recursive and accumulative process (represented by the curved arrows in Figure 1
). In one direction are models developed from clinical understanding and tested for their explanatory value. For example, once strongly held clinical intuitions, such as the “schizophrenogenic” mother or the value of insulin-coma therapy (32), have been seriously and rightly weakened by empirical studies that minimized previously hidden bias. In the other direction, the existing “explanatory” evidence is limited for many complex clinical situations. We then need to rely on the components of understanding, empathy and the judicious choice of interpretive models (whether neurobiological, behavioral, existential, or psychodynamic), to supplement the empirical database. Understanding thus remains a critical tool for the clinical expert, both to generate relevant questions for the scientific task, which can calibrate clinical hunches against the evidence (curved arrows in Figure 1
), and to then engage the patient with information gleaned from all the tasks (dotted arrow).
Evidence-based medicine, as represented in Figure 1
, can help trainees evaluate clinical studies and respond to questions about etiology, diagnosis, prognosis, and treatment response. The teaching of EBM empowers trainees to engage with the challenging, imperfect act of integrating results from studies testing relatively focused hypotheses in the complex arena of clinical decision making.
Implications for Training and Practice: On Becoming Bilingual
I have outlined a limited notion of clinical expertise as consisting of three specific tasks and have contextualized the role of EBM within this notion. While this formulation of the clinical expert defines what skills are to be acquired, it also suggests a framework for implementing and evaluating how clinical learning should occur and where to look for learning resources. A concrete curricular plan deserves separate discussion, but a broad outline is suggested here.
For example, a traditional program with supervised exposure to a diversity of clinical cases and didactics in psychotherapy and pharmacology can facilitate empathic and interpretive skills and background knowledge, but it is inadequate training for the scientific task. However, trainees can be encouraged to formulate focused questions in the context of routine case presentations. This can initiate the first step of EBM. Those questions that are judged to be of importance can then be worked through the subsequent steps. Trainees and training sites can be evaluated on how often and adequately this process occurs, in addition to the traditional supervision of clinical skills. Such a process implies specific attitudes (e.g., welcoming clinical uncertainty as an opportunity for learning), knowledge (e.g., concepts of bias and chance, hierarchies of evidence sources), and skills (e.g., searching electronic databases in a time-efficient manner) that are well described in the EBM literature and can be explicitly assessed.
Why should this notion of clinical expertise be attractive to those charged with designing residency curricula? First, the explicit focus on three tasks will enable trainees to become conversant in the distinct languages necessary for clinical work—the languages of individuals and populations (33), narrative and science (34), or, as I have described them here, Understanding and Explanation. Second, training in EBM will empower the trainee to become a more critical consumer of the scientific literature. The ubiquity of pharmaceutical advertising heightens the need for clinicians who can separate the wheat (probabilistic estimates of effects) from the chaff (marketing). Third, the practice of the five steps of EBM can make operational an approach to uncertainty that has applications beyond the appraisal of clinical studies. Residents conversant with the various types of bias (e.g., in patient selection, attrition, or symptom detection) and the play of chance (i.e., random error) in their clinical data will be protected from making inappropriately strong causal attributions from their clinical encounters. Fourth, EBM can serve as a powerful platform for the acquisition of the specialized content and methodological expertise for those trainees who choose to become clinical researchers. However, I hope to have demonstrated that, far from being an esoteric set of skills of interest only to future investigators, facility with the principles of EBM serves a core function in clinical decision making.
Clinical encounters with individuals suffering from mental illnesses present uncertainties that belie simple recipes. Such moments highlight an ever-present tension between techne and phronesis. If we are to educate residents toward independent clinical expertise, we would do well to welcome this tension as integral to thoughtful clinical work and an opportunity to practice and develop skills in several distinct tasks. Our trainees might then be better equipped to resist premature resolution in favor of explanation or understanding when both are required.