To a great extent academic medicine concerns itself with data and outcomes. Some of these data are related to the development of new knowledge, some to the management of our patients, some to the mission of education, some to the management of academic careers, and some to quality assurance. Technology has enabled us to track, manage, and report these data with increasing ease, making transparency and accuracy more achievable. The purposes of this article are to describe 1) two complementary technology systems used in academic medicine to improve quality; 2) the barriers and obstacles encountered in implementing these systems; and 3) how technology facilitated positive change for the organization.
Medical Student Portfolios: My Oleport (My Online Electronic Portfolio)
No one would question an architect’s or artist’s need for a comprehensive portfolio to demonstrate the depth and breadth of his or her ability. Traditionally, evaluation of medical students has relied on written exam scores, objective structured clinical exams (OSCEs), oral exams, 360-degree evaluations from patients, nurses, and peers, and narrative reports (usually very brief) from faculty. These student evaluations were typically completed at the end of a course, with a copy kept in a folder in a locked cabinet in the registrar’s office. Faculty advisers were required to request copies or go to the office to examine a student’s folder. The few students in academic difficulty discovered that a school official was indeed following their progress when they were called into the Dean’s office. For students who performed at or above average, however, final grades provided the only reassurance that they were moving forward.
Missing from the student’s folder was any attempt on the part of the student at self-assessment or peer assessment. This means that faculty are limited in their ability to determine whether a student can engage in lifelong adult learning, including the ability to identify weaknesses and to develop a focused learning plan to address those weaknesses, two skills that are crucial to the practicing physician. To address these deficiencies, some medical educators have come to rely upon Web-based electronic portfolios (ePortfolios) as a data management system, a method to stimulate reflective learning, and an assessment tool. These tools also facilitate communication and provide more effective mentoring, particularly when geographic distances make communication difficult. ePortfolios are intended to encourage systematic reflection by mandating that students maintain narrative logs that foster a critical examination of their performance, and attempt to manage their own professional development with feedback from mentors.
The ePortfolio drives the interaction between a faculty mentor and advisee at least three times a year. Two weeks prior to a required meeting with their mentor, students are sent an e-mail asking them to log in to the Oleport and address newly posted self-reflective questions. Some questions relate to career development (Figures 1 and 2) and others ask students to engage in reflective learning, as is the case following an OSCE. Each structured entry requires students to address three domains with regard to particular assigned graduation competencies or objectives: 1) what are the weaknesses/strengths in achieving specified competencies? 2) what is the learning plan to achieve competency? and 3) how will the student identify outcome measures to demonstrate competency? As a simple example, students might watch a video recording of their exam of a 55-year-old man with back pain. After watching the encounter with a skill assessment guide, the students are asked to 1) identify the strengths and weaknesses of their exam; 2) develop a learning plan to improve their exam of a patient with back pain (e.g., “I will find a neurologist to demonstrate an evidence-based approach to the back pain or review an online tutorial of the back exam”); and 3) identify outcome measures to prove competency (e.g., “I will have a neurologist observe my exam of the back and sign off that I am competent”). At the conclusion of each ePortfolio session, the entry is e-mailed to the faculty mentor, who is encouraged to respond to the student in writing. In all cases, mentors are required to meet with their students to review their reflection and approve learning plans.
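The three-domain structure of each entry can be sketched as a simple data record. The class and field names below are purely illustrative, not the actual Oleport schema, which is not described at the code level in this article:

```java
import java.util.List;

// Hypothetical sketch of a structured ePortfolio entry: one assigned
// competency, plus the three required reflective domains.
public class ReflectionEntry {
    private final String competency;            // assigned graduation competency
    private final String strengthsWeaknesses;   // domain 1: self-assessment
    private final String learningPlan;          // domain 2: plan to achieve competency
    private final List<String> outcomeMeasures; // domain 3: evidence of competency

    public ReflectionEntry(String competency, String strengthsWeaknesses,
                           String learningPlan, List<String> outcomeMeasures) {
        this.competency = competency;
        this.strengthsWeaknesses = strengthsWeaknesses;
        this.learningPlan = learningPlan;
        this.outcomeMeasures = outcomeMeasures;
    }

    /** An entry is complete only when all three domains are addressed. */
    public boolean isComplete() {
        return !strengthsWeaknesses.isEmpty()
            && !learningPlan.isEmpty()
            && !outcomeMeasures.isEmpty();
    }
}
```

Enforcing completeness at entry time mirrors the system's requirement that a student address all three domains before the entry is e-mailed to the mentor.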
The ePortfolio as a Learning Tool
Reflection should be a crucial part of adult learning. It involves an active recollection and assessment of past events intended to encourage adults to learn from experience. It is no surprise that adults benefit from continuously looking back on their actions/behavior and analyzing them in the context of how things might have been improved. Ideally, reflection is the beginning of a cycle and is followed by practicing new approaches and then again reflecting back with the goal of continuous quality improvement.
The ePortfolio is an online tool intended to track and manage this reflection. If used appropriately, this reflective tool helps students develop more in-depth understanding of how and why they learn, and it asks them to consider alternative approaches to learning that might be tried in the future. It is important for the students to receive regular feedback on the specifics of their reflective learning (calibration) that fosters a critical attitude toward their performance and helps them plan and manage their own development. Remedial plans are needed to address the needs of those students who lack the critical ability to appraise their strengths/weaknesses. Measurable outcomes will be necessary to document impact.
Limitations of the ePortfolio
ePortfolios are effective learning tools for most, but not all, learners. To achieve maximum effectiveness, certain conditions must be met, including 1) a good explanation of the ePortfolio, including its goals/objectives and what is expected of the student; 2) clear, user-friendly technology; 3) appropriate follow-through with a mentor; 4) appropriate faculty development to ensure strong coaching skills; and 5) a summative assessment (1). Even when each of these conditions is met, faculty must go to considerable lengths to convince students of the benefits of reflective learning. While providing structure is crucial to helping a student learn self-assessment skills, too much structure can be stifling and reduce reflective learning. We have also found that reflection for the sake of reflection, without direction, is not fruitful. Requests for students to “log on” to the ePortfolio system need to follow clearly identified clinical experiences where reflection would be of benefit, such as home visits, hospice visits, OSCEs, the end of clerkships, or career planning exercises. Following written reflection, the students receive timely discussion and feedback on the quality of their ePortfolio entries and on their identified learning plan.
Though students object to being graded on their portfolios, faculty feel that completion of the portfolio is an important part of the students’ assessment in the domain of professionalism. Completion of the portfolio requires time from both the student and faculty. In recognition of this, we provide students with a pass-fail summative assessment on reflective learning, and we encourage faculty to provide narrative comments to encourage students (and faculty) to take the exercise seriously. Merely developing an ePortfolio is no guarantee that reflective learning will occur. All stakeholders (faculty, students, and administration) must clearly understand its purpose and objectives.
Faculty Portfolios: MyInfoVault
For many years, American medical schools have had a growing interest in improving methods to document and manage information related to faculty accomplishments for promotion and tenure evaluations, including creation of faculty portfolios to document teaching contributions. Many of the publications that describe faculty portfolios are limited to documentation of accomplishments in the educational mission (2–7).
The University of California (UC) has well-established requirements regarding preparation of faculty portfolios (often referred to as packets or dossiers) for faculty merit and promotion reviews. These requirements address accomplishments in all missions (education, research, clinical service and university/public service) and include specifications on what information can and must be included and the format in which it is presented (8,9). UC faculty members are required to undergo a merit review every 2 to 3 years and review for promotion after a maximum of 7 years as an assistant professor and 6 years as an associate professor.
Each major step requires preparation and submission of a packet whose required contents and format are specified by these policies.
Both faculty and staff complain that such frequent submission of so much detailed information and creation of the required packets are time-consuming and often redundant, since the faculty are requested to submit the same information for different purposes multiple times each year.
To address these issues, in 2002, the University of California, Davis, School of Medicine created a mission-based reporting (MBR) program. Our MBR program, reported previously, was a Web-based program in which faculty members reported their activities on an annual basis (10–14). We believed faculty members would be more likely to participate in MBR if the information collected could be saved in a data repository used for other purposes as well, such as preparation of their merit or promotion packets or a grant submission, thus minimizing the requests from a department for information and the effort required to provide it. Though MBR was ultimately discontinued after a pilot program in our school, we did develop an online faculty data repository, MyInfoVault (MIV). Implementation of MIV is currently in progress. It is capable of producing three documents: a packet in the required format for faculty merit or promotion reviews, a curriculum vitae for personal use, and an NIH biosketch for grant submissions. All MIV pages are constructed dynamically based on the user’s profile. The system is written in Java, a de facto industry-standard, object-oriented programming language. The major design concept of MIV is a central data repository with a variety of integrated applications that generate a series of professional reports and documents. Complete information on MIV and its development is available on the MIV homepage (14).
To enter data, the faculty member or a designated assistant logs in to gain access to the MIV system. Figure 3 illustrates the general data entry screen, and Figure 4 illustrates the screen for publication entry. After the data are entered and saved, they can potentially be used in any of the three MIV applications: CvOnline, PacketOnline, or NIH Form. Since not all data elements entered are pertinent to each application, the user can choose which elements to include, and these choices are saved. Formatting options also exist, and packets for merit/promotion reviews can be sent directly to the different levels of review through MIV. Each level of review can add its own components in MIV.
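The central-repository design described above, one data store feeding several document generators, each of which pulls only the elements the user chose to include, can be sketched as follows. The class, record, and method names are hypothetical illustrations of the design concept, not the actual MIV implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical sketch of MIV's central-repository concept: data elements are
// entered once, and each application (CvOnline, PacketOnline, NIH Form)
// selects the subset the user has chosen to include.
public class FacultyRepository {

    /** One saved data element plus the user's per-application inclusion choices. */
    public record Element(String category, String text,
                          boolean inCv, boolean inPacket, boolean inBiosketch) {}

    private final List<Element> elements = new ArrayList<>();

    public void add(Element e) {
        elements.add(e);
    }

    // Shared selection logic: filter the single repository by each
    // application's inclusion flag.
    private List<String> select(Predicate<Element> chosen) {
        return elements.stream().filter(chosen).map(Element::text).toList();
    }

    public List<String> cvOnline()     { return select(Element::inCv); }
    public List<String> packetOnline() { return select(Element::inPacket); }
    public List<String> nihForm()      { return select(Element::inBiosketch); }
}
```

The point of the sketch is that the faculty member enters each item once; the three output documents are just different filtered views of the same repository, which is why saved inclusion choices eliminate the redundant data requests described earlier.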
As of January 2006, over 2,000 individual faculty accounts were created in MIV from 75 different departments across the UCD campus. These include all 25 departments in the School of Medicine, in addition to departments in the School of Veterinary Medicine, College of Agricultural and Environmental Sciences, College of Biological Sciences, College of Letters and Sciences, School of Education, and College of Engineering.
As is typical of implementation of most new systems, frustrations were noted during the learning phase, largely due to unanticipated “bugs.” Additionally, the initial data entry to create an MIV record was considered to be time-consuming and tedious. Uniformly across the school, department academic assistants performed the data entry. Very few faculty members have entered data or maintain their own records. In general, academic assistants have reported data entry and packet preparation to be no more difficult than most other computer tasks they regularly perform, and this opinion was shared by both first-time users and experienced users. They found finalizing and electronically sending packets for review to be very easy. Departments found it particularly advantageous to have packets saved in a central repository that only required updating at review time. Prior to MIV, frequent staff changes caused repeated loss of packets from previous years, necessitating re-creation of entire packets each time a faculty member underwent a review.
Department analysts responsible for preparing packets and School Personnel Committee members were surveyed regarding their experience using MIV packets for academic reviews. Questions addressed both the experience of preparing packets and the experience of reviewing completed packets. Respondents answered on a scale of 1 to 7, with 1 defined as “easy” or “better than conventional methods,” 4 as “the same as conventional methods,” and 7 as “worse.” All users rated the ability to generate the packet in the standard format, as well as its appearance and readability, to be at least as good as, and often much better than, a packet prepared by conventional methods, and found that MIV packets significantly conserve space and paper. Most members of the School Personnel Committee, which reviews packets for academic advancement, and the Associate Dean for Academic Affairs find electronic review of packets more convenient than the conventional paper packet and find it easier to locate and abstract pertinent information. School Personnel Committee members also felt that having packets available electronically to all members facilitated their group discussion, since paper packets had previously been printed only for the primary and secondary reviewers. All users share the opinion that MIV is a very favorable concept with much future promise, though all feel that additional functional enhancements are necessary, particularly increased system speed. Few faculty members have as yet taken advantage of the curriculum vitae or NIH biosketch features in this initial stage of implementation.
Both the ePortfolio and the MyInfoVault have required change and buy-in from stakeholders. Development of both tools required considerable time and effort over several years by a multidisciplinary team, including computer programming and systems experts, and personnel from both the Dean’s office and departments that oversee and are intimately familiar with the requirements. Users of all levels must also be included in the development process to achieve the necessary perspective and buy-in. All users anticipate significant long-term benefits.
To understand change, in general, is to understand the relationships among organizational issues, political issues, and technical issues.
In our society, political issues often drive change, whether in internal or external, micro or macro sectors. The complex inner workings of academic medical schools and centers operate at the micro political level, with a classic bureaucracy (formal jurisdictional areas, a hierarchical chain of accountability, rules, records, dedicated training) organized on the basis of departments, seniority, research/publications, productivity, and other factors. At the macro political level, government (a State Department of Health or state or federal government) is required to be responsive to external pressures (the public, the press). In short, it is politics (micro and macro) that determines in what direction the ship will move. However, a vision for direction with no method of execution would be no vision at all.
Change requires an appreciation of how an organization (or a profession) functions. Most academic medical schools exist in divided territory, divided by specialty, seniority, primary role (researcher versus clinician), location (academic hospital versus ambulatory center), ability/willingness to teach, and place of training (15). These organizational aspects dictate interrelationships between individuals, groups, and external forces. Institutional history, local personalities, stakeholders/opinion leaders, local culture, and internal and external pressures also have an impact on the organization. Since change may involve gains/losses to individuals, groups and the institution, the dynamics of ownership, leadership, and power arise. Social systems preserve members’ interests and resist change. Fault lines in the quasi-stable environment are opportunities for change (e.g., a lack of concordance in members’ ideology, competition between subsystems).
It is important to understand an organization’s response to pressure and how it affects change. At the two extremes of change are public conformity and private acceptance. As Kelman describes, there are three ways that members can be encouraged to change: compliance, identification, and internalization (16). Compliance is the application of positive or negative incentives intended to direct behavior. Identification occurs when one member seeks to establish or maintain a relationship to another, and it lasts only as long as the relationship that initiated the change (15). Internalization is the acceptance of influence because the change is congruent with the member’s existing value system, usually to reduce cognitive dissonance or because it is required by internal values (17).
Technical aspects of change can be altered through training/education, software, rebudgeting, and other interventions. Each organization must decide for itself what paddles to use and how to coordinate the rowing. Historically, choices in medical education were left to the rowers (departments, courses, or individual instructors), with resultant Brownian motion. Critical thinkers felt that, sitting at the level of the cresting waves, rowers lacked the perspective to chart a meaningful forward direction. Several parties have key roles: political entities point the direction, the organization develops priorities and strategy, and individuals implement the technical aspects.
Change Initiated by Technology
In our two examples (ePortfolio and MyInfoVault), change occurred for several reasons. External pressure (Accreditation Council for Graduate Medical Education core competencies, Liaison Committee on Medical Education requirements for competency-based education, and University-imposed faculty standards) forced changes in the organization. For an institution to succeed with ePortfolio, all participants must see its relevance and receive training and prompts to use it. It must be integrated into other advising activities, for example, so that adviser-advisee visits coincide with diary work. For those with vision, this is an opportunity to create a new dialogue, whereas for faculty it is a change, and for coordinators it is a significant logistical challenge. In the case of MyInfoVault, it was not only the faculty member who needed to believe that a compilation of outcome measures related to performance was beneficial to their career, but also the staff member who needed to see that it would ultimately reduce their workload.
Academic medicine is under siege and needs new prototypes for managing educational, clinical and research visions. In research and clinical care, changes have been driven by reimbursement patterns, a drive to reduce error, and increased productivity—all of which depend upon clear documentation. To a large extent, technology has been an enabling force for much of this change, including the two complementary technology systems (ePortfolio and MyInfoVault) used in academic medicine to facilitate and measure outcomes, improve performance, and allow transparency. The implementation of both systems involved overcoming significant barriers and obstacles, but in the end, technology has become a part of the solution extending our armamentarium and facilitating information transfer.
FIGURE 1. Examples of Questions Posed to Students
Students receive questions specific to a course, to their development as a doctor, or to their learning process. Advisers may also be invited to add thoughts, which are used for discussion.
FIGURE 3. General Data Entry Screen From MyInfoVault
Note the large number of general categories. Data from any of these can be chosen to create the merit/promotion packet, curriculum vitae, or NIH biosketch. Additional categories can be created.
FIGURE 4. Data Entry Screen for Journal Publications
A similar format exists for other data categories, such as research grants and teaching activities.