The field of psychiatry is growing rapidly. Breakthroughs in neuroscience, statistics, and genetics have challenged our field to integrate a vast knowledge base of biology and emotion into a new science. In such an environment it is logical to invest our resources in getting more psychiatrists into the laboratory and training them in the newest techniques while also honing their clinical skills. However, many potential researchers are lost to our field because they lack a few basic skills necessary for scientific exploration. Namely, they lack the ability to frame questions so that they can be meaningfully answered by the scientific methods at their disposal. Although scientific knowledge and technical skills will ultimately be prerequisites for any successful scientist, learning to ask answerable questions is an important first step and one that is often overlooked (1).
Any scientific pursuit must begin with asking a question in a way that allows it to be answered (2). One can reduce the basic notion of statistics to turning a clinical or laboratory experience into a quantitative statement (3). However, the process of medical education often provides little practice in this endeavor. Residents come to us after years of being told what to think and what to study. Their medical education is often permeated with rote memorization of facts in order to earn good grades.
Yet, throughout their lives, physicians are asked to do research in their daily practice of medicine, although few would define it that way. One could view every treatment as a small experiment. The experiment starts with gathering data: the history and laboratory results. Next, one develops a hypothesis; in the case of a single patient, the hypothesis is usually the best guess at a diagnosis. On the basis of this diagnosis, a treatment plan is developed, and that plan can be considered an experiment to test the hypothesis. Perhaps most critical to any hypothesis, or treatment for that matter, is what is done when the experiment does not turn out the way we expected: we either examine the experiment for potential error in how it was done (Did the patient take the medication the way it was prescribed?) or develop a new hypothesis (or diagnosis) based on what happened with the first experiment.
Over the past 10 years, the authors have developed a seminar series that brings together these basic steps of doing research. It is designed to provide basic building blocks that can be easily overlooked in the rush to provide more technical knowledge to our young investigators of the future. It goes to the core of the nature of science: asking questions that can be answered with the tools at one's disposal. Once the data are collected, it guides the resident in interpreting the results, which requires an openness to the possibility that things may not have turned out the way one expected (4). On a more pragmatic level, it requires the ability to write up those findings so that they can survive the careful and critical scrutiny of colleagues in the field and still be published. Below, we present some of the short-term outcomes of teaching this seminar, as well as some thoughts on its long-term impact.
The seminar is hierarchically organized as 12 one-hour lectures. The first eight or nine lectures take the resident through the different types of study designs, from case reports to chart-review studies, clinical trials, laboratory experiments, and large-sample prospective and retrospective epidemiologic studies. The final one-third of the seminar explains how to write up and publish one's findings and how to critique someone else's manuscript (see Table 1). One of the unique and useful aspects of this seminar is that it teaches form and process rather than content. Consequently, the content of the readings for each lecture can be altered to keep them current and to highlight the expertise of the seminar facilitator or individual lecturer. For instance, in the case of the primary facilitator and author of this course for the past 10 years (MTP), most of the readings are in genetics and obsessive-compulsive disorder (OCD). This allows the facilitator to focus on the general form of a particular lecture, say the issues and structure of conducting a clinical trial, without getting caught up in or distracted by the findings. The general goal of the course is both to demystify research and to desensitize residents to doing it, making them comfortable with the thought that they, too, can contribute to the scientific literature. Over the 10 years it has been taught, the seminar has gone through some modifications to reflect the ever-changing needs of the residents who have taken it. However, the basic format, which includes critical reading, thinking, and writing, along with the 12 core topics taught as 1-hour lectures, has remained the same, as outlined in Table 1.
A writing and publishing experience is an important part of this seminar. This is stressed beginning with the first and second lectures, which examine the important parts of any research study: introduction, methods (both subjects and procedure), results, and conclusion. These first two lectures also discuss the value and limitations of an "N-of-1" study. Later in the seminar, by Lecture 10, the resident is asked to write a "Letter to the Editor" for publication. Some residents have written about side effects, others about a novel treatment that did or did not work. The focus of the letter-to-the-editor writing experience is to think of it as a small anecdotal research paper. To start, it must contain an introduction, wherein one cites the existing literature. The methods section includes two parts, the subjects and the procedure. In the case of the letter to the editor, the subject is usually a single patient, and the procedure is usually no more complicated than the use of a specific medication or the occurrence of a surprising result or side effect. The results section simply reports what happened to the patient. Finally, the conclusion or discussion section offers some explanation or hypothesis of why the result occurred. If possible, the discussion should also cite the current literature to support the hypothesis. Because the letter-to-the-editor format has a word limit, it has the added benefit of forcing the writer to be succinct. This is good practice for writing manuscripts and grants in the future, where rambling or verbose prose often distracts from the presentation of the results or diminishes the likelihood of publication.
This seminar has been facilitated and taught by one of the authors (MTP) nine times over the past 10 years, at three different institutions: 1) SUNY at Stony Brook, Stony Brook, NY (twice; n=16 residents); 2) Brown University, Providence, RI (three times; n=26 residents); and 3) SUNY at Buffalo, Buffalo, NY (four times; n=32 residents). Thus, a total of 74 residents have taken this seminar with MTP as seminar facilitator and core teacher. Incorporated into the syllabus for this seminar are two short-term outcome measures: a self-assessment survey and an objective test, both given twice, at the beginning and end of the seminar. The self-assessment survey (available on request from MTP, with special thanks to Carol McLeod) includes nine core questions: seven questions about estimated knowledge of writing up a scientific report, data management, literature searching, and statistics, and one question each on the value of and interest in research. The objective test (available on request from MTP, with special thanks to Tana Grady-Weliky) includes 17 multiple-choice and true-false questions about measurements, statistical tests, and research design. These instruments have been used consistently over the past 4 years at SUNY at Buffalo. However, before that time and at the other institutions where this seminar has been taught, the data were inconsistently collected and/or are not available to the authors, so they will not be reported.
Over the 10 years, the concepts and content of this seminar have been presented at the American Association of Directors of Psychiatric Residency Training (AADPRT) meetings in January of 1993, 1996, and 1998; the Association for Academic Psychiatry (AAP) meetings in March 1997 and October 1999; and the American Psychiatric Association (APA) meetings in May of 1997, 1998, and 1999, the last as part of a CME course on doing research for general psychiatrists. Consequently, it has been adopted, in part or in toto, by several other institutions from time to time over the 10 years. These institutions have included (but are not limited to) Duke University (Durham, NC), Columbia University (New York, NY), University of Cincinnati (Cincinnati, OH), UCLA (Los Angeles, CA), and University of Hawaii (Honolulu, HI). As noted above, outcome data were not readily available from these sites at the time of this writing.
In addition to the outcome data from formal assessments, the number of letters to the editor submitted and published has been tracked informally through communication between former seminar participants and the facilitator (MTP) about their subsequent publications at all three institutions where she has taught the seminar: SUNY at Stony Brook, Brown University, and SUNY at Buffalo.
The results of the objective test were analyzed with a one-tailed paired t-test, anticipating that the course would improve resident knowledge about research and statistics. Pre- and post-test data were available for 24 of the 32 residents at SUNY at Buffalo who took the seminar between 1996 and 1999. Improvement was significant, with an increase in mean test scores from 69.1% (±15.0) to 74.4% (±11.0; P=0.04, df=23).
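For readers less familiar with this analysis, the paired t-test operates on the within-resident differences between post- and pre-test scores (this explanatory formula is standard and is not part of the original report). With n paired observations,

t = \bar{d} / (s_d / \sqrt{n}), with df = n - 1,

where \bar{d} is the mean of the n difference scores and s_d is their standard deviation. With n=24 residents, this yields the df=23 reported here, and the one-tailed P value is the probability of observing a t statistic at least this large if the seminar produced no true improvement.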
In terms of the self-assessment survey, pre- and post-measures were also available for 24 of the 32 residents at SUNY at Buffalo who had taken the course over the same 4-year period. The first seven questions on this survey are rated on a 1–5 scale: 1=poor, 2=fair, 3=average, 4=above average, and 5=excellent. To calculate the impact of the course on self-assessed knowledge, these first seven questions were totaled for a maximum possible score of 35. Again, a one-tailed paired t-test was used because it was anticipated that the residents would improve their knowledge from the beginning to the end of the seminar. The results were again significant, with an increase in mean total score from 17.4 (pre) to 23.9 (post; P<0.0001, df=23).
Self-assessment survey questions 8 (value of research to your subsequent career) and 9 (attitude/interest in doing research) are rated on a 1–10 scale, where 1 represents no value/interest and 10 represents very valuable/very interested. These two questions were analyzed separately with two-tailed paired t-tests because we could not readily predict what the impact of the seminar would be. This analysis revealed no significant change in the self-assessed "value" of research, which in fact was already rated as high (7.8) before the seminar and increased only to 8.7 afterward (P=0.10, df=23). However, interest in doing research did show a significant increase after the seminar, with a change in mean rating from 7.5 to 8.4 (P=0.01, df=23).
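As an illustration only, the same style of analysis can be reproduced in a few lines of Python; the score arrays below are hypothetical and do not correspond to the actual resident data.

    import numpy as np
    from scipy import stats

    # Hypothetical pre- and post-seminar scores for paired residents
    # (illustrative values only; not the study data reported above).
    pre = np.array([60, 72, 55, 80, 65, 70, 75, 68])
    post = np.array([66, 75, 60, 82, 70, 74, 78, 72])

    # One-tailed paired t-test of improvement (post > pre);
    # the 'alternative' argument requires SciPy 1.6 or later.
    t_stat, p_value = stats.ttest_rel(post, pre, alternative="greater")
    df = len(pre) - 1

    print(f"t({df}) = {t_stat:.2f}, one-tailed P = {p_value:.3f}")

For the two-tailed analyses of the "value" and "interest" items, the alternative argument would simply be omitted (the default is two-sided).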
Yet another outcome measure is the number of Letters to the Editor submitted for publication. Informal data are available on 68 of the 74 residents who have taken the seminar since its inception; the six residents who had just completed the seminar at the time of this writing (December 1999) have not yet had time to submit their letters and have them published. Thus, of the 68 residents followed over 9 years, at least 23 have submitted letters for publication, and 13 of these, more than half, have been published. Journals in which they have appeared include The American Journal of Psychiatry and The Journal of Clinical Psychiatry.
The research goals of this seminar are rather modest. On the basis of the objective test and self-assessment survey, residents do seem to improve their basic core knowledge of research methodology and statistics. Perhaps even more gratifying is that the residents' interest in doing research actually increases, although their perceived value of research, which was already high, did not increase significantly. Anecdotally, in the years this seminar has been taught by the author (MTP), more than one resident has commented on the mastery they now feel in reading the research literature. Some have sheepishly admitted that they no longer skip over the Methods section and simply assume the study was done correctly.
The Letter to the Editor experience has also been both a good exercise in writing up findings and a stepping-stone for discussing the risks and benefits of anecdotal findings. It has stimulated discussions about the random error, chance, systematic error, and bias that can enter any type of research, as well as discussions of the generalizability of findings from one patient, to research samples, to the typical patient seen in practice. However, perhaps most gratifying to the residents has been the excitement of their first publication and the knowledge that they too can contribute to scientific knowledge in our field. It is hard to estimate the lifetime value of this first successful experience, but one can only assume it is positive.
A fourth potential outcome measure for this seminar would be to follow up with the residents who have taken it and see where their careers have taken them. Anecdotally, at least two or three residents over the years have surprised themselves by pursuing research careers. To truly measure the impact of this seminar, however, a group of residents at a comparable institution who did not have the seminar over the same time period would be needed as a valid comparison group. Plans for such an impact study have recently begun.
No one needs to be left out of doing research. It can be an enriching and invigorating process for anyone who pursues it. It also makes us better doctors, because knowledge in our field will be ever-changing, and we need a way to acquire it. Recently, the Accreditation Council for Graduate Medical Education (ACGME) has recognized the need for training programs to ensure that all residents are trained in research methodology and statistics. In addition, the Residency Review Committee (RRC) requires most residencies, including psychiatry, to incorporate research skills into their curriculum. This seminar might appear impossible to set up at a given institution or in a given department, especially one that is not particularly strong in research. However, some topics related to research design and methodology are typically already being taught in any given training program. If someone is designated as seminar facilitator and simply organizes the existing lectures in sequence, it is often easy to see which topics are missing (see Table 1). One can then draw on the expertise of one's own faculty and the broader medical school faculty to teach and to make the experience of doing research seem real and doable to any resident. Thus, less attention should be paid to the specific topic of the research and more to its form or process. At SUNY at Buffalo, plans are already under way to take the syllabus from this seminar one step further and adapt it to other medical disciplines besides psychiatry. Given the focus on form rather than content, the lectures can easily be adapted with different readings so that they are relevant to residents in medicine, pediatrics, surgery, and other specialties.
For our field to continue to grow, we need to encourage and train more residents to pursue research, whether simply to report clinical findings in a systematic and meaningful way or to participate in more complex basic science or multicenter research endeavors. We are optimistic that this course can act as an important first step on this path.