Tod B. Sloan, MBA, MD, PhD, Celia I. Kaye, MD, PhD, William R. Allen, Brian E. Magness, and Steven A. Wartman, MD, PhD
Abstract
Changes in the education, research, and health care environments have had a major impact on the way in which medical schools fulfill their missions, and mission-based management approaches have been suggested to link the financial information of mission costs and revenues with measures of mission activity and productivity. The authors describe a simpler system, termed Mission-Aligned Planning (MAP™), and its development and implementation, during fiscal years 2002 and 2003, at the School of Medicine at the University of Texas Health Science Center at San Antonio. The MAP system merges financial measures and activity measures to allow a broad understanding of mission activities and thereby facilitate strategic planning at the school and departmental levels.
During the two fiscal years mentioned above, faculty of the school of medicine reported their annual hours spent in the four missions of teaching, research, clinical care, and administration and service in a survey designed by the faculty. A financial profit or loss in each mission was determined for each department by allocation of all departmental expenses and revenues to each mission. Faculty expenses (and related expenses) were allocated to the missions based on the percentage of faculty effort in each mission. This information was correlated with objective measures of mission activities.
The assessment of activity allowed a better understanding of the real costs of mission activities by linking salary costs, assumed to be related to faculty time, to the missions. This was a basis for strategic planning and for allocation of institutional resources.
Changes in the education, research, and health care environments have had a major impact on the way that medical schools fulfill their missions. Specifically, rising costs, including escalating salaries for faculty, coupled with declining clinical care reimbursement and shrinking educational support have presented substantial challenges, particularly regarding teaching. Without a regular assessment of resource allocation and faculty effort, the allocation of institutional resources can rapidly become misaligned with the activities and contributions of the faculty. It is therefore essential that methods be developed that permit objective assessments of resource allocation. Schools need a firm grasp on the efforts of their faculty and their allocated costs in order to plan strategically for the continued academic and financial success of the institution.
The Association of American Medical Colleges (AAMC) has promoted a methodology for this challenge known widely as mission-based management (MBM).1 In general, this involves the quantification of the activities of faculty in relation to the traditional missions of teaching, research, clinical care, and administration and service. MBM has also assessed the productivity of this activity and the costs associated with each mission. MBM systems have served as a tool for medical schools to optimize the alignment of institutional resources with both the existing activities of the faculty and new strategic initiatives. Many articles in the literature have shared various approaches and the associated results, which often involve the redistribution of resources based on the quantity and quality of faculty effort.1–15 Implicit is the recognition that historic methods of allocating funds do not match the faculty’s actual contributions to the missions. If data can be obtained that are derived from these actual contributions, then they can be used to “correct” the maldistribution of resources.
In addition, MBM has attempted to address the problem of defining the relative value of the productivity measures in each mission-related activity. Methods that have attempted to do so often result in a complex process that is difficult and/or expensive to administer. Further, such approaches may find it problematic to value nontraditional or novel methods of instruction. Additionally, academic faculty can find the assessment of productivity threatening. Nevertheless, using a system of weighted measures for faculty contributions can have the advantage of moving efforts more rapidly in the direction of desired change.
However, changes in the direction of the school’s missions may be possible using a system which is simpler, less expensive, and more acceptable to faculty than one that emphasizes productivity and the value of each mission-related activity. The faculty of the School of Medicine at the University of Texas Health Science Center at San Antonio (UTHSCSA) participated in the development of a simpler system, and its implementation over two fiscal years, FY2002 and FY2003, has provided the opportunity to assess its utility.
In this article, we describe that system and its development. It is a relatively simple method of mission-based management within a medical school, which focuses on assessing faculty activity in each of the missions and the associated revenues and costs. We have termed this process Mission-Aligned Planning (MAP™). Our goal was to gain the knowledge and insight necessary to guide the institution through its current challenges, change its direction in selected areas, and improve the operating margin of each department and the medical school.
Development of the System
The development of the MAP system, which began in 2001, was based on three criteria:
▪ that it involve a relatively simple method of assessing faculty effort and the cost of that effort, but make no attempt to weight the productivity or value of faculty effort in its primary data acquisition;
▪ that it prove to be useful at different administrative levels throughout the institution; and
▪ that it minimize cost by using as much available information and resources as possible.
We (the authors) developed the MAP system as a three-step process. After a series of departmental meetings, organized by the dean and the authors, to inform the faculty about the system's goals and the institution's commitment to it, the first step was the enumeration of faculty activities, which were measured by a survey of the faculty in each of the two academic years devoted to forming the database. Next, a process was developed to obtain the expenses and revenues relative to the school's four missions of teaching, research, patient care, and administration and service. Administrative time was defined as time spent on activities within the institution (e.g., administrative positions or committee work) and was divided into activities for the department or hospital, for the medical school, and for the university. Service time was defined as time spent conducting activities outside the institution (e.g., national lectures, service to professional societies). Faculty development time (e.g., increasing administrative skills) was not differentiated between within and outside the institution.
This required that a consensus be achieved regarding specific conventions to be used for faculty activities that simultaneously apply to more than one mission (e.g., providing care for a patient while teaching medical students and/or residents). Finally, all faculty activity and financial information data were merged into a unified financial format that would facilitate both analytic and strategic decision making. In order to protect the confidentiality of faculty members as well as to permit department chairs the opportunity to manage their departments effectively, each of the school's 13 clinical departments received individual data on its own faculty only; the dean's office received information aggregated at the department and school-wide levels. (Those departments are Anesthesiology, Family and Community Medicine, Internal Medicine, Obstetrics and Gynecology, Ophthalmology, Orthopaedics, Otolaryngology, Pediatrics, Psychiatry, Radiation Oncology, Radiology, Rehabilitation Medicine, and Surgery.) The basic science departments at the institution are part of a separate graduate school of biomedical sciences and were not part of the MAP system. Because financial information is most easily acquired for a full academic year, it was decided to use two consecutive academic years, FY2002 and FY2003, to form the initial MAP database. At UTHSCSA, the academic and fiscal years are coincident and run from September 1 through August 31. We will refer to years as fiscal years throughout this article.
The activity survey was developed through a faculty-led process designed to maximize faculty input and buy-in. The survey sought to identify all activities considered essential for carrying out the school’s missions. Initially, the MAP system was “beta tested” in a department that had high levels of activity in all mission areas. It quickly became evident that there were several activities that appeared to be unique to this department. As a result, the activity identification process was broadened by using workgroups within each of the clinical departments to incrementally expand the scope of the activity survey and include the unique aspects of each department. Early in the process, two departments piloted the survey instrument to gain specific feedback on the activities measured and the methodology. A final faculty consensus group representing all departments finished the survey instrument and assisted in developing the instruction set for the system’s administration.
The faculty consensus group also determined the amount of time to be allocated to a faculty activity in cases where the time spent would be either difficult to recall (e.g., lecture preparation) or likely to vary substantially (e.g., the percentage of time spent in clinical work and research activities that would be allocated to teaching). For example, the group decided to limit the time allocated for lecture preparation to an agreed-upon amount based on the type of lecture (new lectures, significant updates of previous lectures, and minimal updates). And in those cases where teaching occurred during research or clinical care activities, the group determined which portion of that time would be allocated to teaching and which to the primary activity (research or clinical care). These conventions were revised for the second-year survey based on data from the first year's faculty survey.
The faculty activity survey took approximately one year to develop. The consensus group decided to allocate 50% of time to teaching when either clinical activity or research activity was being conducted in the presence of students and the faculty indicated that teaching was occurring. This convention was unchanged during the second year of the survey, as the initial survey results indicated a median estimate of 50% by the faculty. Lectures were categorized as new or as major or minor updates of previous lectures, and the consensus group elected to allocate lecture preparation times of 16, 8, and 2 hours, respectively, per hour of lecture. The first survey's results indicated median lecture preparation times of 10.4, 3.7, and 1.3 hours per hour of lecture; these medians were used during the second survey period.
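To make these conventions concrete, the sketch below applies the FY2003 rules to one hypothetical faculty member's raw annual entries. It is our illustration, not the school's actual spreadsheet; the field names and example numbers are invented, and only the 50% dual-activity split and the 10.4/3.7/1.3-hour preparation medians come from the text.

```python
# A minimal sketch, assuming hypothetical field names, of how the survey
# conventions could turn one faculty member's raw annual entries into hours
# credited to each mission. Only the 50% dual-activity split and the FY2003
# preparation medians (10.4, 3.7, 1.3 h per lecture hour) come from the text.

PREP_HOURS_PER_LECTURE_HOUR = {"new": 10.4, "major_update": 3.7, "minor_update": 1.3}
TEACHING_SHARE = 0.5  # fraction of dual clinical/research-plus-teaching time
                      # credited to the teaching mission

def mission_hours(entry: dict) -> dict:
    """Convert raw survey entries (annual hours) into per-mission hours."""
    lecture = sum(entry["lecture_hours"].values())
    prep = sum(hours * PREP_HOURS_PER_LECTURE_HOUR[kind]
               for kind, hours in entry["lecture_hours"].items())
    # Split dual-purpose time 50/50 between teaching and the primary activity.
    clin_teach = TEACHING_SHARE * entry["clinical_with_teaching"]
    res_teach = TEACHING_SHARE * entry["research_with_teaching"]
    return {
        "teaching": lecture + prep + clin_teach + res_teach + entry["other_teaching"],
        "clinical": entry["clinical_alone"] + entry["clinical_with_teaching"] - clin_teach,
        "research": entry["research_alone"] + entry["research_with_teaching"] - res_teach,
        "admin_service": entry["admin_service"],
    }

example = {
    "lecture_hours": {"new": 6, "major_update": 10, "minor_update": 20},
    "clinical_alone": 600, "clinical_with_teaching": 400,
    "research_alone": 300, "research_with_teaching": 100,
    "other_teaching": 80, "admin_service": 150,
}
print(mission_hours(example))
```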
Two other issues that needed resolution were the allocation of state-appropriated revenue and the allocation of faculty salary expenses. An administrative consensus group decided to allocate 80% of state-appropriated revenue to the teaching mission and 20% to administrative activities. This was based on the premise that these funds were intended by the state legislature to support the educational mission and that a portion was necessary to support the administrative infrastructure that facilitates teaching. Lengthy discussions occurred regarding this approach, notably whether some of these funds should be allocated to faculty support for start-up research activities. The decision to do so is an internal one, but the simplicity of the MAP methodology is such that this and other approaches can easily be changed and the resulting impact assessed. Since both education and administration had negative operating margins and required cross-subsidization from the clinical and research missions, decreasing revenue to either of these would not change the overall picture.
Distribution of faculty salary expenses also deserves a note. First, it was agreed that there would be no distinction between missions when the cost of faculty time was assessed in the MAP system. An hour of teaching would be assigned the same cost (the actual cost of time for that faculty member) as an hour of clinical work or research. Second, it was decided to allocate salary expenses based on the percentage of faculty time spent in each of the four missions exclusive of clinical at-home calls. In the survey instrument, call time was defined as time away from the campus during which the faculty member was available by pager or phone for consultation. Call time that required the faculty to be “in-house” was included in clinical time.
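As an illustration of the salary convention, the sketch below (ours, not the school's actual worksheet; the salary figure and hours are invented) splits one faculty member's salary across the four missions in proportion to reported hours, with at-home pager and phone call excluded from the denominator.

```python
# A sketch, under the convention described above, of allocating one faculty
# member's salary in proportion to mission hours. At-home "pager" and "phone"
# call receive no salary allocation; in-house call is already counted inside
# clinical hours. All numbers are invented.

def allocate_salary(salary: float, hours: dict) -> dict:
    """Split salary by each mission's share of hours, ignoring at-home call."""
    counted = {m: h for m, h in hours.items()
               if m not in ("pager_call", "phone_call")}
    total = sum(counted.values())
    return {m: salary * h / total for m, h in counted.items()}

hours = {"teaching": 700, "clinical": 800, "research": 350,
         "admin_service": 150, "pager_call": 500, "phone_call": 200}
print(allocate_salary(150_000.0, hours))
# clinical receives 800/2000 of salary; the 700 h of at-home call receives none
```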
The final survey instrument was distributed as an Excel spreadsheet and included a request for an estimate of the hours spent in mission-specific activities. Information was also requested regarding mission activities in specific geographic locations (e.g., different hospitals and clinics) and with different levels and types of trainees (e.g., medical students, graduate medical trainees, graduate students). Also requested were an estimate of the time needed to complete the survey instrument, an estimate of lecture preparation time, and an estimate of the fraction of time during research or clinical activities that should be devoted to teaching. The spreadsheet compared the entered data with each faculty member's estimate of weekly hours of activity to assist in identifying areas of underreporting or overreporting. Faculty were asked to reexamine their entries when they exceeded maximum reasonable limits, and they were asked for corrected data when time entries were obviously in error. In order to make the survey most effective for mentoring, additional information, such as detailed reports of the products of academic work (e.g., papers published, presentations made), was also recorded.
Collection of activity data began approximately three months after the completion of each fiscal year. Departments were asked to obtain survey information from all faculty, regardless of salary status, who participated in mission activities in excess of ten hours for the academic year. Each department strove for a 100% completion rate, as the departmental leadership wanted all of their mission activities to be recorded. The responsibility for having the faculty complete the surveys and then working with the faculty to correct survey entries was assumed by the respective departments. Therefore, the departmental leadership had access to departmental data immediately following the submission of the data. Faculty rosters, the human resources database, and clinical care records were used to identify faculty missed in the initial survey replies. After the spreadsheets were completed, the results were collated into a departmental aggregate by an impartial intermediary (TS) who acted as both an institutional advocate and a faculty advocate to ensure the completeness and integrity of the process.
The aggregate data were verified with the department leadership before presentation to the dean. A similar process was followed during the second survey year except that the consensus groups used information learned in the first year’s survey process to make changes in the instructions to improve the second year’s process. The faculty consensus group also reevaluated the time allocated for lecture preparation as well as the fraction of clinical and research time allocated to teaching when teaching was being conducted in the context of clinical and research activities. This second survey requested the same data; only the instructions and spreadsheet format were changed slightly.
It was recognized that the data entered by faculty would be an imperfect recollection of exact time utilization; therefore, a large variety of objective activity measures was collected to corroborate the departmental aggregate activity data. For example, the Dean's Office collected readily available information about the hours of lecture activity, measures of student rotations, and various other measures of educational activities conducted by departments. For the clinical mission, RVU (relative value unit) activity was collected. RVUs are a common scale developed by the Health Care Financing Administration (HCFA) and subsequently modified to quantify the work and resource costs needed to provide physician services across all fields of medicine.16 The HCFA system of RVUs has a physician component, a practice component, and a malpractice component. For our analysis, only the physician component was used.
Grant funding data were collected as objective measures of the research mission. For administrative activity (exclusive of service activity), the number of departmental faculty was recorded. It must be emphasized that these "productivity" measures were not primary data elements in the survey; rather, they were elements assembled by the Dean's Office to provide a "reality check" on the information from the surveys.
In addition, no effort was made to assign relative values to various educational, clinical, research, or scholarly activities. For the purposes of our analysis, we assumed that all activities of the faculty were valuable, and that it was the role of the chair to direct faculty to tasks that were most beneficial to the department and school. In essence our goal was to obtain a reasonable picture of the activity distribution across the missions. We recognized that a focus on “perfect data” would not only be costly but likely impossible (e.g., measuring actual clinical teaching time).
The second and third steps were the development and preparation of the unified financial spreadsheets for merging the activity data and the expenses and revenues within each of the departments. Table 1 shows the unified financial worksheet for a hypothetical department. The administrative consensus group of school leadership and departmental administrators also determined the conventions for the allocation of institutional funds to the various missions. With respect to revenues, the allocation of practice plan income to the clinical mission and of National Institutes of Health grant support to the research mission was straightforward. As mentioned above, each department’s state appropriations were allocated according to an 80/20 split, with 80% of state funding used to support the educational mission and 20% used to support the departmental administrative infrastructure. On the expense side, faculty salaries were allocated across the missions proportionate to each department’s overall faculty survey results. While the faculty survey results do not provide a precise reflection of the effort of departmental support staff, department chairs felt that the activities of their support staff generally occur in the mission areas of the faculty they support. The allocation of non-salary expenses into the four missions was accomplished through a set of expense allocation guidelines developed by the administrative consensus group.
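The worksheet logic itself is simple enough to sketch. Below is a compressed, hypothetical version of the per-department calculation: revenues are assigned by the conventions above (practice plan income to the clinical mission, grants to research, state funds split 80/20 between teaching and administration), and faculty salary expense is spread by the department's aggregate survey percentages. All dollar figures are invented for illustration.

```python
# A simplified, hypothetical unified worksheet for one department. The 80/20
# state-funds split, the revenue-to-mission conventions, and the salary
# allocation by survey effort shares follow the text; the figures do not.

MISSIONS = ("teaching", "clinical", "research", "admin_service")

def mission_margins(revenues, state_appropriation, faculty_salary,
                    effort_share, other_expenses):
    rev = {m: revenues.get(m, 0.0) for m in MISSIONS}
    rev["teaching"] += 0.8 * state_appropriation       # 80% to teaching
    rev["admin_service"] += 0.2 * state_appropriation  # 20% to administration
    exp = {m: faculty_salary * effort_share[m] + other_expenses.get(m, 0.0)
           for m in MISSIONS}
    return {m: rev[m] - exp[m] for m in MISSIONS}

print(mission_margins(
    revenues={"clinical": 9.0e6, "research": 2.5e6},
    state_appropriation=1.5e6,
    faculty_salary=8.0e6,
    effort_share={"teaching": 0.36, "clinical": 0.37,
                  "research": 0.17, "admin_service": 0.10},
    other_expenses={"teaching": 0.3e6, "clinical": 2.0e6,
                    "research": 0.8e6, "admin_service": 0.6e6},
))
# A negative teaching margin here signals cross-subsidization from clinical care.
```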
The final presentation of information used for strategic planning included bar graphs created from these worksheets that demonstrated the positive or negative financial margins in each mission for each department. Additional information was gained by correlating various mission activity subsets with the respective expense and revenue amounts.
Regression analysis was used to correlate the objective teaching data with the MAP survey teaching hours. The total teaching hours recorded in the survey for all categories for each department were compared with the sum of the percentage contributions of that department in each objective teaching category. To determine the correlation between the data submitted for FY2002 and FY2003, the hours in each category of mission activity, adjusted for the number of FTEs (full-time equivalents), were compared. A p value less than .05 was considered significant.
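The text does not specify the statistical software used; the sketch below shows one way to run the equivalent check with SciPy, regressing each department's share of MAP-reported teaching hours on its share of the combined objective teaching measures. The thirteen per-department values are fabricated purely to make the snippet runnable.

```python
# Hypothetical per-department percentage shares (13 clinical departments) of
# the school-wide totals; the numbers are fabricated and illustrate the
# method only, not the school's actual data.
from scipy.stats import linregress

objective_share = [14.0, 3.5, 22.0, 8.0, 2.5, 6.0, 3.0, 18.0, 9.0, 1.5, 5.5, 2.0, 5.0]
map_teaching_share = [12.5, 4.0, 20.0, 9.0, 3.0, 5.0, 2.5, 19.5, 10.0, 2.0, 6.0, 2.5, 4.0]

fit = linregress(objective_share, map_teaching_share)
print(f"slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}, p = {fit.pvalue:.4f}")
# A p value below .05 would be considered significant, as in the text.
```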
The First Two Years
Below we describe the results of using the MAP system during its first two years. Faculty surveys were completed three to four months after the end of each of the two fiscal years so that the financial information would match the recording of effort. In our description below, FY2002 refers to the fiscal and academic year of September 1, 2001, through August 31, 2002, and FY2003 refers to the fiscal and academic year of September 1, 2002, through August 31, 2003. During FY2002, 880 survey instruments were completed, representing 802.7 FTE; for FY2003, 987 survey instruments were completed, representing 892.8 FTE (the number of salaried faculty had increased substantially by the time of the second survey). Faculty indicated a median time for completing the survey instrument of 2.0 hours for FY2002 (with several faculty indicating substantially longer times); the median for FY2003 was 1.5 hours.
Table 2 shows the hours that all the school’s faculty reported, in the two surveys, toward fulfilling the various missions. Teaching hours shown included hours delivering lectures to students, non-lecture time (e.g., small group sessions, ward rounds), lecture preparation, teaching during clinical care (50% of the time when clinical care was being delivered and the faculty indicated that teaching was also occurring), teaching during research (50% of the time when teaching was being done during research activities), commuting between teaching at different locations, administrative time for teaching (e.g., organizing student rotations), and teaching development (e.g., taking courses on teaching). For the purposes of this report, teaching included hours spent with all students of the university: medical, dental, graduate, nursing, allied health, and all graduate medical trainees in programs sponsored by the institution (e.g., residents, fellows).
Also shown are the hours recorded in the clinical mission. These were the hours spent delivering direct clinical care (i.e., where the faculty were providing clinical care without teaching), delivering clinical care while teaching (50% of the clinical care time when teaching was occurring), commuting between clinical activity at different locations, performing other professional activities (e.g., dictating, record reviewing, legal work), faculty clinical development (e.g., learning new clinical skills, continuing medical education courses), and hours spent "on call." Time recorded for faculty who were required to remain in the hospital on call (e.g., obstetrics, anesthesiology) was recorded in direct clinical care. Call time outside the hospital was divided into "pager call" (where a faculty member was not in the hospital but had to be available by pager to return, such as a surgeon on call) and "phone call" (where a faculty member was not in the hospital but had to be available by phone for consultation). The time spent on "pager call" or "phone call" was not included in the time allocated to the clinical mission for the purposes of determining the allocation of faculty-related expenses in the financial margin.
The hours recorded in the research mission were time spent conducting research without teaching students, time when teaching was occurring during research (50% of the time was allocated to teaching and 50% to research), and time spent in faculty research development (e.g., learning new research skills). Time spent in research was not differentiated by type of research (e.g., bench research, clinical research, population research).
Excluding pager and phone call, the distribution of the hours entered for FY2002 and FY2003 was 36.2% teaching in the first year (35.9% in the second year), 36.8% clinical work (36.4%), 16.9% research (17.7%), and 10.1% administration and service (10.0%). When the aggregate financial spreadsheet for the entire school is merged with the mission activities, the pattern of finances that emerges is shown in the financial margin bar chart in Figure 1. As noted by the negative margin, the cost of education far exceeded its allocation of funds. As expected, the clinical mission was the major activity for which revenues exceeded expenses, with the research mission also providing a positive margin (note that research revenues included nonfederal payers and endowments). Also as expected, the administration and service mission's expenses exceeded its revenues. Hence, the cross-subsidization of the teaching mission and the administration and service mission came primarily from the clinical mission and, to a minor extent, the research mission.
The objective information collected for the missions was used to confirm and corroborate the activity measures recorded by the surveys. This was particularly important for the teaching mission, where independent measures of clinical teaching were often not available. Figure 2 is a bar plot by department, for the first survey year, of the percentage of the entire teaching load as measured by independent objective data on teaching activity in each of four categories (undergraduate medical education course directorships, medical school didactic course contact hours, third-year medical school clerkship faculty hours, and fourth-year medical school student rotation hours) versus the total teaching hours for each department collected by the survey (i.e., MAP total teaching hours). The general concurrence of the first four measures with the MAP measure supports the belief that the reported survey hours reasonably reflected the relative contribution of each department to the overall teaching load. Regression analysis showed a significant correlation between aggregated objective measures of teaching and total teaching hours reported in MAP (p = .011).
Bar charts similar to that shown in Figure 1 were constructed for individual departments and served to highlight issues that then formed the basis of departmental strategic decision making. Information gained during the first MAP program year (FY2002) was used to assist the school leadership in the allocation of institutional funds for FY2004 (September 1, 2003 through August 31, 2004). However, since the effects of the FY2004 allocations were not known prior to the start of FY2005, there were no reallocations based on the information from the second MAP program year (FY2003). As the effects of the FY2004 allocations were understood in the context of information gathered during the third MAP program year (FY2005), it is expected that a refinement in the allocation of institutional funds will occur in FY2006 or later.
We learned from this experience that the use of a full year of faculty survey and financial data, which must be collected and analyzed in the fiscal year following the year surveyed, results in a two-year delay between the year surveyed and implementation of changes in resource allocation. This delay could be reduced by utilizing a shorter survey period (e.g., first six months of the fiscal year), with analysis during the second half of the fiscal year and changes in allocation in the next fiscal year. Alternatively, since we observed little change in faculty activity between the two years, data from a previous year could be used to predict the subsequent year’s activity if no significant shift in activity has occurred. In future years, it is expected that insights derived from the MAP system will enable the school leadership to maintain a solid alignment of institutional funds with the missions and strategic plans of the school.
The emphasis in this analysis has been on developing insights into the net balance and cross-subsidization of school and departmental missions for the purpose of guiding strategic planning while improving overall financial performance. It is also a tool to assist the departments in moving in the direction of the vision of the school's leadership. Obviously, this was particularly important where negative margins in key mission areas were of concern, or where the operating margin for the entire department was negative. A good example is shown in Figure 3 for a department that had substantial external support for the educational mission yet a negative overall departmental margin. In addition, for this department the usual areas of positive margin (clinical and research) were also negative. The plan at the time of the first survey had been for the department to expand its clinical enterprise to improve the operating margin. However, the actual negative margin in clinical work suggested that simply expanding the existing clinical paradigm would worsen the margin. Instead, the department was encouraged to reexamine each clinical activity, reduce those activities with negative margins to only the services necessary for the educational mission, and expand in areas of positive clinical margin. In this case the analysis suggested that a strategic change in clinical activity was needed, not a change in resource allocation. As seen in the second survey year, this approach was helpful.
A second example is seen in Figure 4, which shows a financial bar chart for a specific department. In this department, a slightly negative operating margin worsened significantly in the second year (FY2003). Our analysis suggested that the overall change was caused by worsened margins in all mission areas. Of particular interest was the change in the research mission. To understand this change, objective data and survey data for all departments were merged (Figure 5): the average research support per hour of research activity was plotted against the average hours recorded in the research mission for every department. As seen, two departments deviate substantially from the roughly linear relationship shown by the remaining departments. One (Department Y) has relatively high funding per hour of research; the other (Department Z, the department in question) has low research support even though its research hours per FTE are high. This suggested that the department in question could improve its financial performance by increasing external research funding or by decreasing time spent conducting underfunded research. Alternatively, some research effort could be reallocated to increase clinical activity and income. Again, a strategic change in faculty effort was needed, not a reallocation of resources.
Why the System Works
The success of the implementation of the MAP system was due to several factors. First, the leadership of the medical school demonstrated its commitment by visiting all departments to explain the system. Second, because the faculty developed the survey tool themselves, it best reflected their efforts; the final instrument also included various measures of academic productivity that could be used by departmental leadership for faculty mentoring and career guidance. Concerns about misuse of individual data by central administration were addressed by reporting only aggregate data to the leadership of the school; hence, only the faculty and the departmental leadership had access to individual data. Finally, only faculty in clinical departments participated in this project, obviating the need for comparisons between clinical and basic science faculty. In this context, the faculty embraced the opportunity to develop the survey and willingly completed the tool.
Concerns have been raised that a reporting tool such as this may be misconstrued as being in conflict with other effort-reporting measures, such as reporting time on federally funded projects or contracted time (e.g., Veterans Administration salary eighths). Since the data were immediately available to the faculty and departmental leadership as the survey instruments were completed, the faculty had an opportunity to ensure that the reporting was congruent among the various reports. This issue is complicated by the fact that reporting periods are not identical for different sources of funds, and definitions of time and effort differ. The school of medicine specifically asked faculty to compare their self-reported MAP data to federal effort reports to ensure appropriate completion of all forms.
This method of mission-based analysis differs substantially from those described in the majority of the published articles on the topic. The fundamental difference is that our approach estimates the relative operating margin for each mission by determining the net cost of activity in the missions; the productivity and/or relative value of the activities contributing to each mission was not used as a primary measure. It should be noted that we placed no constraints on the data faculty could enter, unlike some approaches that force a limited number of hours worked per week.2,4 We also made relatively few assumptions regarding time allotments (i.e., the limitation of lecture preparation time and the percentage of clinical and research time allocated to teaching when the faculty indicated that teaching was occurring during these activities). The cost of mission activities so measured includes all costs directly linked to mission activities as well as the portion of faculty salaries allocated by the proportion of faculty time devoted to each mission. This allows future planning to focus directly on the largest expense and the most crucial resource: faculty time.2,3 The result is a more realistic assessment of the faculty (salary) cost of a mission. In addition, since each salary cost can in theory be directed to another mission, the opportunity cost of faculty activities can be estimated.
In contrast, a system that emphasizes productivity bases resource allocation on past strategic decisions and past performance. Such a system has been termed "backward looking" when future allocations are driven strictly by past strategic decisions and data, and it tends to entrench the status quo unless sufficient incentives are built in to move toward new activity targets. Further complicating this approach are the lags that exist between the time the activity is measured, the time the resources are distributed, and the time the next assessment is made to determine the effect. In effect, by the time the data are collected from the past budget year, the next opportunity for resource reallocation is either midyear or the following fiscal year, often resulting in up to a two-year lag. This built-in delay makes decision making difficult.4
The approach presented here does not include valuations of particular activities based on past strategic decisions. As a result, the data are more straightforward, and strategic decisions regarding future resource allocation can be made more readily, based on an assessment of the actual cost of faculty effort as well as opportunity costs associated with particular activities. In this sense, the MAP approach is more forward-looking than other MBM approaches. However, the time delay associated with the use of a full fiscal year as the survey period led to a two-year delay between survey year and changes in resource allocation for the MAP system, as would be true with other MBM systems also. This long delay could be reduced by use of a shorter survey period, as noted previously.
A system of resource allocation based on productivity ("leadership by the numbers"5) also hampers strategic planning through the need to follow the data.6 This often leads to a search for "perfect" data, an exercise that adds complexity while delaying decision making.7 Such a focus on data may distract from strategic thinking by impeding the ability to respond to strengths, weaknesses, opportunities, and threats to the school's missions. Hence, our goal was to develop a less complex system that gathered meaningful information to facilitate basic and strategic decision making.
We achieved our goal of a simple, inexpensive system that can provide helpful insights for strategic decisions. Given the relatively low cost of the MAP system, it is important to emphasize that the real work was distributed among faculty and administrators, who we believe had a vested interest in the project. The collection and correction of survey data were done by the departmental administrators; actual data gathering and entry were tasked to individual faculty; and the financial analyses were conducted by the office of the associate dean for finance of the medical school. Only one individual (TS) received direct salary support: a faculty member who acted part-time as the intermediary between the departments and the medical school administration. This person developed the data repository, managed the data collation, and provided the school leadership with the aggregated effort data that were merged with the financial information. He also worked with the department chairs and administrators to ensure the completeness and integrity of the data while assuring its confidentiality. As a result, the medical school administration was able to work confidently with aggregate departmental data while individual departments were able to manage their faculty as best suited each department's mission, all in a complex and challenging environment where multiple missions must be served by faculty with widely differing individual interests and motivations.6,8,9 For some departments, initiatives to increase the clinical and/or research missions consistent with the school's vision were developed with the understanding that they would eventually have positive margins. In others, where the operating margins for the clinical and research missions were negative (or where the overall balance was negative), the data were examined for insights that could help guide the department leadership.
Clearly, the real cost of the MAP system is the faculty time and effort spent in survey completion and the related administrative costs. Fortunately, the time to complete the survey decreased in the second year, suggesting that faculty were more familiar with the instrument. Other than the part-time individual mentioned above, there were no recurring fixed costs. Hence, the costs of using survey data in subsequent years are minimized. The similarity of the data between the two survey years suggests that the frequency of subsequent surveys can be decreased unless major shifts in faculty number, activity, or school financing occur.

Other approaches have focused on assessing productivity for medical school missions, and proposals for relative weighting of these products have been published.1,2,7,10–13 Each of these approaches requires that a value judgment be derived that can be used to strategically drive the productivity of the faculty if reallocation of faculty activity is anticipated. However, previous publications have indicated that academic faculty are generally not accustomed to having their activity measured in this way.9 Therefore, a method that surveys time utilization may be less threatening than one focused on productivity or the relative values of various activities. While the assessment of productivity (and any associated incentive plans) can be effective in improving positive margins and providing funds for missions with negative margins, this approach can be threatening for the teaching mission or the administration and service mission, where increased productivity may not increase a positive margin or even contribute to a meaningful expansion. Further, measuring and valuing teaching conducted during the clinical and research missions is quite difficult, as objective measures are hard to come by. Research productivity may also be difficult to assess, since it is cyclical and has a built-in lag time.14 When RVUs are used as the basis for clinical incentive systems,2,10 these units do not allow assessment of the impact or quality of the interaction, especially if teaching is occurring during the clinical care. Similarly, it is difficult to assess the impact or quality of teaching in a large-group format versus more personalized teaching of a small group or an individual.3,7
The MAP system presented here is useful independent of the issue of quality, since it assesses only the cost of the activity. This cost of activity assessment also allows estimation of the opportunity costs when certain activities are reduced in favor of another, or when a new activity is introduced and the costs and revenues can be estimated. We agree that quality issues must be considered, but independent quality indicators can be used to weigh the various mission activities in light of their costs and operating margins.
Mission-based management, as promoted by the AAMC, has been defined as “a process for organizational decision making that is mission driven, ensures internal accountability, distributes resources in alignment with organization-wide goals, and is based on timely, open, and accurate information.”8 The methodology we present allows calculated “strategic” decision making, since the direction of movement is not predetermined and the estimated costs of strategic initiatives can be developed. As challenges or opportunities present themselves, the impacts of shifts in activities can be estimated because the relative costs and impacts on the operating margins can be approximated.
One potential drawback of the method presented here is the necessity of using a survey instrument. Previous publications have noted the value of faculty survey techniques in budgeting and manpower planning.2 The survey method, when linked to productivity measures, has the inherent disadvantage of potential abuse, especially when known incentives cause the faculty to inflate the time entered.2,8,15 Since the current system does not link directly to productivity, the potential for abuse is reduced. Also, the independent collection of activity indicators from available school data allows confirmation of the validity of the survey's findings. If, as in our case, the survey instrument is developed by the faculty themselves, faculty anxiety is reduced.8,13,14 Further, since no weighting of the value of activities occurs with this method, the faculty have more confidence that the data will not be manipulated. The only manipulation in this system is through the conventions for lecture preparation and for the fraction of teaching credited when clinical or research activities are occurring simultaneously; fortunately, these were set by the faculty consensus group and updated based on the findings of the initial faculty survey. Other survey-based methods have reported poor faculty participation.9 However, since each departmental administrator had a vested interest in ensuring that the full departmental contribution to the missions was recorded, we achieved full participation.
Survey methodology can also be criticized because of the inaccuracies inherent in faculty members' memory of their activities over the past year. In our school, the survey results were remarkably unchanged in hours logged per FTE when the two survey years were compared, suggesting that the data are likely representative of the distribution of faculty effort. When the aggregate data from FY2002 and FY2003 were compared on hours per FTE in each category of reporting, the p value was .0001. Consistent with this, a large number of faculty resubmitted their FY2002 data for FY2003. However, if the accuracy of the data is to be confirmed, an independent system for recording activity (e.g., lectures) would be needed. This would clearly increase the complexity (and cost) of the methodology and would likely still fall short of measuring the teaching associated with clinical and research activities. The survey has the advantage of establishing time utilization patterns that can be used for mentoring and for strategic planning of faculty members' careers. For example, the extent to which an individual's survey deviates from the expected can serve as a focus for this planning.
Because the survey is based on a faculty member's recollection of his or her time utilization, it is essential to use corroborating objective information if decisions are to be made regarding resource allocation. A variety of information sources are generally available for the clinical and research missions. However, key objective measures may need to be derived for the teaching mission. At our institution, a variety of measures of teaching effort (e.g., those shown in Figure 2) were readily available and were used to corroborate the relative distribution of teaching effort.
One useful aspect of this approach is that the data can be analyzed for any subset of the entire dataset (e.g., school, department, division, or individual faculty member). Thus the data are helpful for the school to assess its overall operating margins, for departments to review their contributions to the missions, and for individual faculty to consider their progress toward promotion and tenure. Two departments used the survey to compare clinical productivity (based on RVUs) with the faculty time recorded in the survey; this was used to assist faculty in focusing their efforts and balancing their academic time. A third department compared the effort recorded in the survey with the acquisition of grant funding and the publication of scholarly manuscripts in order to assess accountability for academic research time. Finally, the data may be useful for providing reports of accountability and activity to external stakeholders (such as time and effort reporting).
A second useful attribute of this method is that the simplicity of the operating margin analysis allows assessment of the impact of different assumptions on conclusions. For example, as mentioned above, state appropriations were divided between the teaching and administrative missions. It would be straightforward to assess the impact of changing the 80:20 ratio or of directing some of these funds to stimulate research. Similarly, it is not difficult to assess the impact of apportioning a different proportion of clinical time to teaching. In our original analysis, we allocated 50% of clinical time to teaching if both activities were occurring simultaneously. As shown in Figure 6, we recalculated the operating margin for the school of medicine by assuming that all clinical time was applied to the clinical mission and none to teaching. Compared with Figure 1, which used the 50% allocation, the operating margin is improved in the teaching mission and decreased in the clinical mission, reflecting the impact of faculty (and related) expenses on the clinical mission. This difference suggests the degree of cross-subsidization from the clinical mission that is needed to support the teaching mission.
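This kind of sensitivity check amounts to a one-parameter change in the worksheet logic. As a sketch (our illustration, with invented hours), the function below recomputes each mission's share of effort, and hence of salary expense, when the fraction of dual clinical/teaching time credited to teaching is varied from the 50% convention of Figure 1 to the 0% assumption of Figure 6.

```python
# Sensitivity sketch: vary the fraction of dual clinical/teaching time that is
# credited to teaching and watch the effort (and thus salary-expense) shares
# shift between the teaching and clinical missions. All hours are invented.

def effort_shares(base_hours, dual_clinical_hours, teach_frac):
    """Mission shares when `teach_frac` of dual time counts as teaching.
    `base_hours` assumes the 50% convention, so only the difference moves."""
    h = dict(base_hours)
    shift = dual_clinical_hours * (0.5 - teach_frac)  # hours leaving teaching
    h["teaching"] -= shift
    h["clinical"] += shift
    total = sum(h.values())
    return {m: round(v / total, 3) for m, v in h.items()}

base = {"teaching": 540_000, "clinical": 550_000,
        "research": 260_000, "admin_service": 150_000}
for frac in (0.5, 0.0):  # Figure 1 convention vs. Figure 6 assumption
    print(frac, effort_shares(base, dual_clinical_hours=300_000, teach_frac=frac))
```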
In conclusion, over the two years during which the MAP system has been used, we have gained much useful insight into the budgetary challenges faced by the medical school. The data have been useful in the allocation of state resources to incrementally correct maldistributions caused by historical allocation methods that no longer reflect actual contributions to the school's activities or its desired strategic directions. As a simple, inexpensive tool, the MAP system has been easily integrated into the budgeting and planning process and has served to inform strategic decision making and resource allocation.
This article was originally published in the November 2005 issue of Academic Medicine.
References
1 Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:199–207.
2 Daugird AJ, Arndt JE, Olson PR. A computerized faculty time-management system in an academic family medicine department. Acad Med. 2003;78:129–36.
3 Watson RT, Romrell LJ. Mission-based budgeting: removing a graveyard. Acad Med. 1999;74:627–40.
4 Whitcomb ME. Mission-based management and the improvement of medical students' education. Acad Med. 2002;77:113–14.
5 Howell LP, Hogarth MA, Anders TF. Implementing a mission-based reporting system at an academic health center: a method for mission enhancement. Acad Med. 2003;78:645–51.
6 Ridley GT, Skochelak SF, Farrell PM. Mission-aligned management and allocation: a successfully implemented model of mission-based budgeting. Acad Med. 2002;77:124–29.
7 Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Acad Med. 2002;77:115–23.
8 Brigham EJ, Tellers CA, Rondinelli R. Academic survival through mission-based management. Am J Phys Med Rehabil. 2001;80:778–85.
9 Garson A, Strifert KE, Beck JR, et al. The metrics process: Baylor's development of a "report card" for faculty and departments. Acad Med. 1999;74:861–70.
10 Cramer JS, Ramalingam S, Rosenthal TC, Fox CH. Implementing a comprehensive relative-value-based incentive plan in an academic family medicine department. Acad Med. 2000;75:1159–66.
11 Bardes CL, Hayes JG. Are the teachers teaching? Measuring the educational activities of clinical faculty. Acad Med. 1995;70:111–14.
12 Hilton C, Fisher W Jr, Lopez A, Sanders C. A relative-value-based system for calculating faculty productivity in teaching, research, administration, and patient care. Acad Med. 1997;72:787–93.
13 Coleman DL, Moran E, Serfilippi D, et al. Measuring physicians' productivity in a Veterans Affairs Medical Center. Acad Med. 2003;78:682–89.
14 Howell LP, Hogarth M, Anders TF. Creating a mission-based reporting system at an academic health center. Acad Med. 2002;77:130–38.
15 Ruedy J, MacDonald NE, MacDougall B. Ten-year experience with mission-based budgeting in the faculty of medicine of Dalhousie University. Acad Med. 2003;78:1121–29.
16 Johnson SE, Newton WP. Resource-based relative value units: a primer for academic family physicians. Fam Med. 2002;34:172–76.