This report describes the development and implementation of the Dartmouth Summer Institute in Psychiatry and Mental Health. The Dartmouth Psychiatric Research Center has conducted research on evidence-based practices for years and has also trained residents in evidence-based practice. Current faculty, who had never received such training themselves, were often unaware of the components, methods, and benefits of evidence-based practice. We therefore offered a training series for our psychiatry faculty, which laid the foundation for the Institute.
McMaster, Oxford, and Duke Universities have established workshops in evidence-based medicine. Oxford and Duke offer small groups within their workshops that focus on psychiatric and mental health issues. Many of the Dartmouth Summer Institute faculty participated in these programs, and our Institute is based, in part, on their models. The Institute is intended for mental health professionals, and its audience has included librarians, pharmacists, and residents as well as practicing clinicians in psychiatry and psychology. Our teaching strategy places a strong emphasis on small-group experiences, encouraging an active, case-based learning process in small, comfortable settings. Furthermore, all participants use wireless laptops and can search, print results, and create critically appraised topics and other related documents. The Institute focuses on basic, practical approaches to learning evidence-based medicine skills that can be applied directly to client care.
Specific objectives of the Institute focus on the following skills:
incorporating evidence-based process into daily client care and teaching;
creating a well-built, answerable, clinical question based on a particular client problem;
using the question to find the best evidence in the scientific literature about diagnosis and treatment of mental health disorders;
developing proficiency with widely available search engines (e.g., PubMed);
appraising the quality and clinical significance of the studies identified;
determining if the results are applicable to everyday decision making in real-time mental health practice;
applying research findings to the care of individual clients;
practicing shared decision making.
Although we focus on the use of PubMed, we provide sessions for using other databases.
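The first steps above, building a well-built (PICO) clinical question and using it to search PubMed, can be illustrated mechanically. The sketch below, which is not part of the Institute's materials, uses hypothetical PICO terms and Python's standard library to assemble a PubMed query and a corresponding NCBI E-utilities `esearch` URL:

```python
from urllib.parse import urlencode

# Hypothetical PICO elements for illustration only; these terms are
# not drawn from the Institute's curriculum.
pico = {
    "population": "adults with major depressive disorder",
    "intervention": "cognitive behavioral therapy",
    "comparison": "antidepressant medication",
    "outcome": "remission",
}

def build_pubmed_query(pico):
    """Combine PICO elements into a single query, quoting each phrase
    and joining with AND."""
    return " AND ".join(f'"{term}"' for term in pico.values())

query = build_pubmed_query(pico)
print(query)

# The query can then be submitted to NCBI's E-utilities esearch
# endpoint; here we only construct the URL rather than fetch it.
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urlencode({"db": "pubmed", "term": query, "retmode": "json"}))
print(url)
```

In practice, searchers would refine such a query with MeSH terms, field tags, and filters, which is precisely the kind of skill the research librarians coach during the small-group sessions.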
Previous research (1), as well as our own experience, has taught us that the primary learning occurs in small-group sessions. At our Institute, the co-leaders have complementary skills and backgrounds to facilitate the learning process. The psychiatrist/psychologist provides a practical clinical perspective, whereas the research librarian brings special expertise in how to use the wide variety of electronic resources and search strategies. The small size of these workgroups enables the co-leaders to provide individualized assistance and mentoring to participants as they work through assigned problems. Each session focuses on acquiring and using a specific evidence-based process skill, with participants usually working in pairs. The use of this small-group format is a core feature of the Dartmouth Summer Institute experience.
The Institute guides participants sequentially through each skill component using the following elements of practice-based learning:
presentation of information (knowledge) that includes handouts, articles, and other materials designed to provide participants with new knowledge;
demonstration of the specific skill. The leader might demonstrate how to translate a challenging clinical case into a well-built question amenable to being searched. The research librarian might demonstrate the computer skills necessary to conduct searches for evidence corresponding to the well-built question;
skill-building sessions covering the key steps in evidence-based medicine: developing the well-built question from a clinical case, finding the evidence, appraising the evidence, and applying the evidence to individual clients.
The small-group format provides the opportunity to work with students at their individual level of technical skill and comfort. Some participants come with substantial experience using search engines such as PubMed and are familiar with the MEDLINE and Cochrane Databases. Others only have experience using nonspecific search engines such as Google, while still others have virtually no computer skills at all. Similarly, some participants readily embrace the use of numeric tools in appraising the evidence, while others are uncomfortable with statistics and mathematical operations. A special feature of the small group is the opportunity to coach participants at their own level of comfort, while assimilating core skills that can be used in practice settings.
A personalized learning environment is also supported by providing case material relating to the participant’s area of clinical interest. For example, in 2007 we divided participants into several small specialty groups with topics including child, general, and geriatric psychiatry. Participants welcomed the opportunity to apply evidence-based techniques to familiar clinical situations and appreciated how newly acquired skills could be directly relevant to their day-to-day practice. This personalized approach is explicitly supported on the last day of the Dartmouth Summer Institute, when participants are asked to use their newly acquired skills to address a clinical case that they prepared prior to coming to the Institute. This last step serves as a direct bridge between the Institute and the participants’ clinical practice and the challenges of their clients.
In our initial year (2006), there were 34 participants: residents, practicing academic and nonacademic psychiatrists, psychologists, social workers, librarians, and pharmacists. We created a pre- and post-Institute assessment of evidence-based skills modeled on the Fresno test, a validated test for evidence-based medicine skills (2). Our assessment was modified from the Fresno to suit our focus on mental health. We presented each participant with a different clinical scenario before and after the program and asked questions about the evidence-based medicine approach. We also asked questions about ways to individualize health care for the client and to employ shared decision making. Two independent raters were given scoring criteria and examples for rating each item. Interrater reliability of four practice assessments was high (r=0.92). Each item was scored on a 3-point scale (1=no correct elements, 2=some correct elements, 3=all correct elements). The pre- and post-Institute assessment scores and level of significance are presented in Table 1.
A t test was conducted to compare scores before and after the Institute. Because the variables are ordinal, ordinal logistic regression was also conducted as a sensitivity check, and its results were consistent with the t test. The 2006 participant pre/post responses indicate significant improvements in six of the nine evidence-based skill areas. These data suggest that our curriculum could help participants improve in identifying appraisal criteria for relevant articles, in applying findings to the client, and in assisting clients in making decisions. Furthermore, qualitative comments from participants indicated that they felt comfortable using evidence-based medicine strategies and, most important, that they intended to incorporate these techniques into their client care and into teaching medical students, residents, and fellows. The ultimate goal of the learning program is to change practice, which is difficult both to effect and to measure. Our hands-on approach to teaching is more likely to produce changes in knowledge scores immediately after the intervention, but whether a change in knowledge is associated with a change in practice has rarely been evaluated.
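The pre/post comparison described above is a standard paired t test. As a minimal stdlib-only sketch, the statistic can be computed on hypothetical pre/post skill scores; the numbers below are made up for illustration and are not the 2006 Institute data:

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post item scores on the 3-point scale
# (illustrative only, not the 2006 data).
pre  = [1, 2, 1, 2, 1, 2, 1, 1, 2, 1]
post = [2, 3, 2, 3, 2, 2, 2, 3, 3, 2]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# Paired t statistic: mean difference over its standard error, df = n - 1.
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(n))
print(f"t({n - 1}) = {t_stat:.2f}")
```

Because the 3-point items are ordinal rather than interval, an ordinal model (as used in our sensitivity check) is a more defensible primary analysis; the paired t test is shown here only because it matches the comparison reported above.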
During our second Institute (2007), the 24 participants assessed each session on a 5-point satisfaction scale (1=poor, 5=excellent). Mean results are presented in Table 2.
Participant ratings suggest that Institute sessions were well received, with overall mean scores between “very good” and “excellent.” In addition, overall qualitative comments from participants supported these quantitative scores and suggested that learning expectations were met and that the skills learned could be put into practice. Participants said that the Institute was scientifically balanced and free of commercial bias, the physical environment was conducive to learning, and that small-group discussions were very effective. Several residency training directors have attended and used the Institute approach with their residents. At least one attendee started a shared decision-making center in a mental health center modeled on the Institute.
As research in many areas evolves rapidly, information technology will be used to convey new findings in a time-sensitive way. Expecting individual practitioners to keep abreast of research or to calculate individual expectancies and risks for positive and negative outcomes will quickly go from difficult to unrealistic. At the same time, however, national data systems for producing and disseminating such risk-adjusted information to points of decision making will also develop rapidly. Today’s cumbersome process of formulating a clinical question, finding evidence, ascertaining validity, and applying the evidence to an individual client will gradually be replaced by better search engines and a streamlined process that will allow clients as well as clinicians to obtain answers to very specific questions expeditiously.
The skills we are teaching now will continue to be conceptually relevant but will be steadily enhanced by computerized information systems that will reduce the time and burden on clinicians while improving the quality of clinical care. As these systems evolve, teaching evidence-based mental health care will become even more critical.
Dr. Cimpean is a preceptor for the Annual Dartmouth Institute in Evidence-Based Mental Health. Mr. Garrity is a board member of iInTime, a 501(c)(3) that develops CAI programs. The remaining authors disclosed no competing interests.