The American Association of Directors of Psychiatric Residency Training (AADPRT) Task Force on Quality in Residency Training previously published a general definition of quality as it applies to residency training (1). In short, that definition emphasizes outcomes relevant to the needs and reasonable expectations of "stakeholders" of training programs (e.g., residents, faculty, prospective residents, patients and families, funding agencies, and professional organizations). A report from the Task Force describing additional background to the present report has been accepted for publication (2).
The next step in the evolution of this view of quality in residency training was the development of outcome measures, of which stakeholder satisfaction is an important example. Because several reports have emphasized the influence of current resident satisfaction on the recruitment of future residents (3–5), and because no reliable, valid, and generally accepted resident satisfaction questionnaire was available, the Task Force undertook to develop a resident satisfaction questionnaire.
What determines resident satisfaction with a program? Skodol and Maxmen (6) surveyed 71 residents in four psychiatric training programs concerning professional role satisfaction, theoretical orientation, perceived technical competence, and need for supervision. Role satisfaction among residents increased during training and correlated with perceived competence and decreased need for supervision. Although this study did not directly address resident satisfaction with training, it suggests that programs that increase residents' perceived competence should also increase satisfaction with those aspects of training related to the acquisition of technical competence.
Haupt et al. (7) surveyed 31 residents for factors contributing to satisfaction with an idealized training program. The five most important factors related to resident satisfaction were 1) quality of attending teaching; 2) feeling of esprit de corps; 3) degree of responsibility for patient care; 4) quality and number of conferences; and 5) outpatient experience.
Several studies found the philosophy of the training program, especially the degree of eclecticism, to be important to applicants as a criterion for selecting a training program (8,9). Other surveys have considered factors important to resident applicants in choosing a program (see, for example, References 3 and 5). A number of factors extrinsic to the quality of the training program emerged, including geographic location, spousal satisfaction with the community, and opportunities for employment after graduation.
This report describes the process used by the Task Force to develop a measure of satisfaction for residents in training, a measure based on a large national sample of psychiatry residents. The Resident Satisfaction Questionnaire (RSQ) offers training directors a more reliable and valid instrument than most programs have the resources to develop independently, and it could be used to compare satisfaction data over time and across programs.
A list of factors related to residents' satisfaction with the quality of training programs was generated from a review of the literature cited above, resident focus groups, and training-director focus groups. Fifty psychiatric residency training directors listed factors they believed residents would consider most important in determining satisfaction with the quality of residency training. Residents (N=15) in two training programs participated in similar focus groups. Although items generated by training directors and residents were very similar, care was taken to include items generated by one group even if they were not generated by the other. Items generated from the literature review and focus groups overlapped to such a degree that additional resident or training director focus groups were not thought to be necessary.
A total of 41 items from the literature review and focus groups comprised the initial survey given to residents to determine their perceptions regarding factors determining the quality of training programs (Table 1). Two "Other" items were included to encourage residents to add items not already listed.
Not included in the items submitted to residents were factors extrinsic to training programs (e.g., geographical location). Although such factors are of great importance to resident applicants, most of them are not under the control of the training program and would be of lesser value in helping programs to improve residents' satisfaction with the quality of training.
The 41-item questionnaire was submitted to residents in 38 programs that had indicated a potential interest in participating in the development of the Resident Satisfaction Questionnaire. Because interest in participating was indicated only by signing a list circulated during the focus groups, a high degree of participation after the meeting ended was not anticipated. Sixteen programs returned questionnaires completed by their residents. A description of the 180 residents who completed the survey is shown in Table 2; the participants appear to be a fair representation of residents in training. Participating programs covered all geographic regions in the United States and one Canadian region.
Residents were instructed that the purpose of the survey was not to determine their current satisfaction with residency training. They were asked to indicate (on a 5-point Likert scale) the importance attached to each factor in determining their satisfaction with residency training (with 5 indicating a factor of great importance). Information regarding residents' backgrounds, interests and orientations, and program descriptions was recorded (Table 2).
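The ranking procedure implied above (mean 5-point importance rating per item, sorted highest first) can be sketched as follows. This is a minimal illustration; the item names and rating values are hypothetical, not the survey data.

```python
from statistics import mean

# Hypothetical importance ratings (5-point Likert scale, 5 = great
# importance); items and values are illustrative, not the study data.
ratings = {
    "quality of supervision": [5, 5, 4, 5, 5],
    "on-call frequency": [3, 4, 3, 3, 4],
    "department morale": [5, 4, 4, 5, 5],
}

# Rank items by mean importance rating, highest first.
ranked = sorted(ratings, key=lambda item: mean(ratings[item]), reverse=True)
top_items = ranked[:2]
```

The same computation extends directly to subgroup rankings (e.g., by gender, training year, or AMG/IMG status) by filtering the responses before averaging.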
The following five items were considered most important by the overall group of resident respondents in determining residents' satisfaction with training programs: 1) quality of supervision; 2) respect of faculty for residents; 3) responsiveness of the program to feedback from residents; 4) balance of training between psychosocial and biomedical aspects of psychiatry; and 5) morale in the department (Table 3).
The top five factors overall were also the five most important factors listed by American medical school graduates (AMGs; 107 responses). International medical school graduates (IMGs; 68 responses) included four of these among their most important factors: quality of supervision, respect of faculty for residents, responsiveness of the program to feedback from residents, and morale in the department. IMGs also included the personal qualities of the program director among their five most important factors.
Among female residents (92 responses), the top five items were the same as the top five from the overall group. Among male residents (85 responses), the top five items included quality of supervision, respect of faculty for residents, education prioritized over service, balance of training between psychosocial and biomedical psychiatry, and morale in the department.
In considering residents in different years of training, the most striking finding was the inclusion by first-year residents of the item regarding level of support from peers. Additional data on year-by-year comparisons are available from the authors, as are data comparing responses between IMGs and AMGs.
Considering residents according to their indicated primary interest or theoretical orientation (biological, psychological, or eclectic), residents indicating a primary interest in either biological or psychological aspects of psychiatry were more likely to consider training in biomedical psychiatry among the top indicators of quality in a residency training program. All three groups (biologically, psychologically, and eclectically oriented residents) indicated that a balance of training between psychosocial and biomedical aspects of psychiatry was important in determining the overall quality of the training program.
Resident Satisfaction Questionnaire (RSQ)
The final list of 10 items for the RSQ included items most important to the overall group of residents, as well as items considered to be among the top five factors by any individual subgroup of residents. (See Appendix 1: Resident Satisfaction Questionnaire.)
Several items considered among the most important by residents were not included in the final RSQ. Because confusion might exist in interpretation of the item "balance of training between biomedical and psychosocial aspects of psychiatry," we opted to include on the RSQ separate items related to satisfaction with biomedical and psychosocial training.
Two closely related items were ranked highly by residents: responsibility given to residents for patient care, and progression in level of responsibility. Because of the overlap in content, only the item with the greater test-retest reliability was included on the RSQ (responsibility given to residents for patient care).
Residents, especially international medical graduates, considered personal qualities of the program director to be important in determining satisfaction with training. Although we included this item in an earlier draft of the RSQ, it was later deleted. We believe that although the item would generally be used fairly and favorably, its inclusion might be problematic for some program directors, especially those involved in difficult personnel decisions.
Only 10 items were included in the RSQ, so as to keep the questionnaire as brief as possible. We believed a longer RSQ, being less convenient to complete and more burdensome for recording and analyzing data, would be less likely to be used.
A pilot version of the RSQ was administered twice, 2 weeks apart, to 15 residents in one program. Test-retest reliability for each item on the final version of the RSQ was assessed via correlations and paired differences. The correlation coefficients ranged from 0.38 (P=0.2992) to 0.92 (P=0.0001); the coefficients and their respective P values may be found in Table 4. The paired differences revealed no statistically significant differences between the test means and retest means (Table 4). [All tests of correlations and differences between means were exact permutation tests; hence all P values are exact P values (10).]
A 10-item Resident Satisfaction Questionnaire (RSQ) has been developed based on a review of the literature as well as items generated by a broad sample of psychiatry training directors and residents. The RSQ reflects items considered by residents as most important in determining satisfaction with residency training programs. The questionnaire has excellent test-retest reliability and should be useful in measuring resident satisfaction with a training program.
The items rated most highly by our sample of residents are consistent with the 1987 findings of Haupt et al. (7), which were based on a smaller sample, suggesting that these values are stable for psychiatry residents. In particular, the special emphasis placed on peer support by first-year residents in both studies suggests that the use of first-year peer support groups by some training programs is well founded.
Our results are also consistent with the recently published report by Daugherty et al. (11) of a survey of satisfaction among 1,277 second-year residents. Daugherty et al. found resident satisfaction to be increased with greater opportunities for learning (e.g., more contact with faculty) and decreased with greater resident perception of maltreatment (e.g., humiliation). These two broad factors, learning opportunities and learning environment, are well represented in the RSQ. Satisfaction with the quality of supervision and teaching, with the responsibility given to residents for patient care, and with training in biomedical and psychosocial aspects of psychiatry (Items 1, 2, 5, 7, and 8) reflect the educational opportunities available, whereas the respect of faculty for residents, responsiveness of the program, priority given to education, department morale, and support from peers (Items 3, 4, 6, 9, and 10) reflect the educational ambiance.
In using the RSQ, we recommend that program directors bear in mind that results are most useful as probes suggesting the need for more in-depth investigation. Thus, if a particular item shows marked dissatisfaction or a declining trend in satisfaction among residents, focus groups with residents would probably be the next step toward understanding the problem. Residency education committees should likewise be familiar with the RSQ and endorse it before it is introduced. An "Other" item has been included on the RSQ to permit programs to evaluate aspects not otherwise covered. We recommend administering the RSQ at least semiannually to uncover emerging trends. Responses to the RSQ should be gathered anonymously and returned to a person who is unlikely to be able to identify a particular individual's responses, and they should be aggregated for use by the program director and residency education committee.
The RSQ, with minor modifications, has been used for evaluating various rotations within a program. It can be used as an outcome measure when a rotation or curriculum is revised and can provide early warning when departmental changes produce unintended alterations of the educational milieu. Programs with comparable groups of residents and other characteristics may want to share results for benchmarking purposes. We are interested in soliciting feedback from programs using the RSQ regarding its usefulness and drawbacks.
One limitation of the RSQ is its lack of benchmarking data. For example, is an overall satisfaction rating of 3.5 "good" or "bad"? How might a given satisfaction score compare with scores from other programs? Does overall satisfaction tend to change with time in training?
Other research questions might be addressed as well. How does resident satisfaction correlate with other putative measures of program quality, such as PRITE scores, Board pass rate, and departmental research funding? Items on the RSQ seem to tap two major domains, educational opportunities and learning environment; are there other, less central domains that contribute to resident satisfaction and dissatisfaction as well? Do other stakeholders, such as faculty, program funders, patients, and future employers, concur with residents' opinions as to the most important indicators of program quality? The Task Force plans to address these and other questions in the future.