Using Strategic Planning to Envision Your Future

AAMC Presentation on Strategy, Development, and Advancement (distribution copy)

David S. Hefner, MPA
Executive Vice President for Clinical Affairs
Chief Executive Officer, Georgia Health Sciences Medical Center & Medical Associates
Georgia Health Sciences University

Susan Barcus
Senior Vice President for Advancement & Community Relations
Chief Development Officer
Georgia Health Sciences University


The Impact of the 2008 Economic Recession on U.S. Medical Schools and Related Organizations

Editorial Manager™ for Academic Medicine Manuscript Draft

Title: Impact of the 2008 Economic Recession on U.S. Medical Schools and Related Organizations

Article Type: Research Report

Corresponding Author: Jack Krakower, Ph.D.

Corresponding Author's Institution: AAMC

First Author: Jack Krakower, Ph.D.

Order of Authors: Jack Krakower, Ph.D.; Margarette C. Goodwin; Heather Sacks; David Hefner

Manuscript Region of Origin: UNITED STATES

ABSTRACT

The near collapse of the U.S. financial system in 2008 had a broad impact that did not spare the U.S. health care system, including medical schools and teaching hospitals. Two years later, the media are still filled with reports about schools and hospitals that are struggling to survive. To gain insight into the impact of the financial crisis on academic medical centers, AAMC staff conducted interviews with the leadership of medical schools that reported experiencing material losses in the funds supporting their organizations.

This article describes the magnitude of the losses reported by participating schools and the factors that should be considered in interpreting their impact. It describes the actions and strategies that schools adopted to address financial stress and suggests steps that might be taken by other schools that find themselves in similar circumstances. It concludes with observations by some interviewees regarding the use of the crisis to bring about needed changes that would not have been possible without a “burning platform.”

An added commentary by a health system president warns against tactical approaches adopted by most schools that fail to address systemic changes required to assure the long-term health of academic medical centers.

BACKGROUND

The collapse of the investment sector in 2008 had widespread national and international impact. As in other sectors of the economy, the fallout included material losses in critical funding sources that support medical schools and teaching hospitals in the United States. News media began reporting the impact of the losses in the fall of 2008, and by early 2009, furloughs and staff layoffs were being reported at universities, medical schools, and teaching hospitals across the country.1

The Association of American Medical Colleges (AAMC) determined it would be useful to investigate the specific impact of the crisis on U.S. medical schools and the steps taken to address the situation. This paper summarizes the causes, the impacts, and the steps some institutions have taken to address problems related to the financial crisis.

In July 2009, AAMC staff contacted Group on Business Affairs (GBA) Principal Business Officers (PBOs) asking those at schools experiencing material revenue reductions attributable to the crisis to participate in a brief phone interview. PBOs from 23 schools agreed to participate in a 30-minute interview. A copy of the interview protocol is included as an appendix to this document.

SOURCE AND MAGNITUDE OF LOSSES

There is no simple way to characterize the magnitude of losses experienced by schools in the study, although the sources are generally predictable. Losses reported by participating PBOs ranged from cuts that were described as imminent but never materialized, to losses from under $1 million to more than $80 million, with anticipated losses at one school approaching $200 million over the next three years.

To understand the impact of losses at a given school, one needs to understand the funding source(s) affected, the intended use(s) of the lost funds, the magnitude of the loss, and the ability of the institution to cover the loss from other funding sources. At some schools, FY09 losses were defined as funds allocated or appropriated that were subsequently unavailable for spending (i.e., budget reductions), whereas other schools defined losses as the difference between budget projections and funds available for spending (i.e., revenue shortfalls). To gauge the significance of a given loss, one also needs to compare current-year revenues to prior-year revenues for the source in question.
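As an illustration of these two loss definitions, the following sketch (with invented figures and function names of our own; nothing here comes from the study data) shows how a loss can be computed either way and then normalized against prior-year revenue for the source in question:

```python
# Illustrative sketch of the two loss definitions; all figures are
# invented and the function names are ours, not the study's.

def budget_reduction(appropriated, available):
    """Loss as funds appropriated but subsequently unavailable for spending."""
    return appropriated - available

def revenue_shortfall(projected, actual):
    """Loss as the gap between budget projections and funds actually available."""
    return projected - actual

def relative_loss(loss, prior_year_revenue):
    """Normalize a loss against the prior year's revenue for the same source."""
    return loss / prior_year_revenue

# A $6M state appropriation cut measured against $120M of prior-year state support
cut = budget_reduction(appropriated=120e6, available=114e6)
print(f"${cut / 1e6:.0f}M cut = {relative_loss(cut, 120e6):.1%} of prior-year support")
```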

In the descriptions that follow, "losses" include both budget reductions and revenue shortfalls. In some cases, losses may have been covered from alternative fund sources such that existing commitments (e.g., faculty salaries) and planned commitments (e.g., program expansions) were ultimately met.

As might be predicted, reductions in state support were the most common source of material losses for public schools (and even a handful of private schools). Schools, particularly private institutions, that relied heavily on investment earnings (e.g., from endowments, quasi-endowments, unendowed reserves, and working capital) to support current operations reported experiencing material losses.1 While media reports have suggested that losses at teaching hospitals have been widespread, the majority of PBOs in our sample reported that their affiliated hospitals expected to break even or end the year with a positive bottom line.2 Even more surprising to us, the majority of PBOs reported that clinical practice income was at least holding steady.3

Covering losses from alternative fund sources was not an option for every school, however; several relied heavily on investment income to support current operations. In many of these cases, endowments had been established to support the creation of new faculty positions and research centers. Faced with reduced earnings, these schools found themselves having either to underwrite those commitments from other fund sources or to materially cut other expenses in order to fund them.

Most PBOs with whom we spoke reported losses in no more than two sources, most often state support and investment earnings. Although gift revenues were affected at many schools, these losses tended to be relatively small. Furthermore, because gifts are often restricted by the donor, medical schools were not reliant on these funds for day-to-day operations.

Several schools experienced significant losses from multiple sources and found themselves facing dire circumstances due to commitments previously made based on revenue assumptions. These commitments included recruiting new faculty to grow the research enterprise, constructing research buildings, and supporting faculty who were critical to clinical programs but generated marginal revenue. This scenario occurred in relatively small medical schools as well as in some of the largest schools in the country. In these circumstances, the coupling of financial losses from multiple sources with the costs of significant new commitments created the "perfect storm."

One of the common refrains we heard from both public and private schools was that the medical school was often viewed by the parent university as the "fat cat" school that could bear a larger share of the budget cuts. It is easy to envision how those who do not understand the nature of medical school financing might come to this conclusion. The median total revenue for public medical schools in FY 2008 was $424 million; the median for private schools was $651 million. The relatively large total revenues characteristic of most medical schools sometimes lead to the erroneous conclusion that cuts of a few million dollars, or additional parent taxes on the medical school, have relatively little impact on its operations.4 However, funds available for discretionary spending are but a small fraction of total revenues.5

The principal sources of seemingly "discretionary" funds that support the operations of medical schools are state support (in the case of public schools), unrestricted gifts, endowment earnings, indirect cost recoveries, and tuition. Only a handful of schools also have access to patent income.

Although "discretionary" fund sources represent a fraction of the budget for most medical schools, more often than not they are the primary source of funding for medical education and medical school administration, as illustrated by the circumstances facing one school that participated in the study. In FY08, this school's total budget was nearly a billion dollars, but state support to the medical school has historically been among the lowest in the country. At the onset of the crisis, state support to the medical school was reduced by one-third. Although the reduction amounted to little more than one-half percent of the school's total budget (implying state support on the order of $15 million), state funds were the primary source of support for medical education, tenured faculty salaries, student services (e.g., financial aid), and medical school administration (e.g., admissions). In this case, the school was able to divert dean's development funds to cover some of the loss but was unable to cover the entire loss, resulting in cutbacks in services and staff.

On the one hand, this example illustrates why it is difficult to characterize the relationship between the magnitude of losses and their effect on a given medical school. On the other hand, we did find patterns between the source of funds cut and the impact. Because state funds are generally used to cover expenses related to teaching and administrative infrastructure, reductions in state funds typically required medical schools to reduce administrative staff in the dean’s office and in departments. The impact on educational programs was less clear because schools took a variety of steps to preserve educational programs as discussed in the next section.

As previously mentioned, some schools relied on earnings from endowments and quasi-endowments to fund key aspects of their current operations. Schools that relied on this funding model experienced material shortfalls due to the loss of investment income.6 These were most often research-intensive schools that made commitments on investment income to hire new faculty and construct new facilities needed to expand their research mission. The loss of investment income meant not only that they could not continue to grow research, but that they were challenged to meet commitments made to new faculty and to fund debt service on new facilities.

While this study focused on reductions in revenues, one PBO pointed out that the other side of the equation is material increases in expenses, particularly expenses associated with compliance, quality assurance, and technology. The school in question had experienced cost increases exceeding 15% annually for the last six years, all without offsetting sources of revenue.

STEPS TAKEN TO ADDRESS LOSSES

Steps taken to address budget reductions and revenue shortfalls can be characterized as falling into two categories: tactical and strategic. The former encompasses short-term steps aimed at stanching losses and shoring up the enterprise, whereas the latter encompasses longer-term actions driven by a more holistic approach to dealing with financial problems.

While a strategic orientation may have guided the steps schools took to address the budget crisis, most of the actions taken were described in more tactical terms. It should be noted that the degree to which a particular medical school’s leadership participated in making decisions about how to approach the budget crisis varied across schools, so that medical schools that might have otherwise chosen a strategic approach were sometimes partially constrained by tactical steps mandated by the parent university.

The tactical steps offer no surprises: reductions in discretionary expenditures (e.g., travel), hiring and salary freezes, and administrative staff furloughs and layoffs. It was rare for furloughs or layoffs to apply to faculty, and when layoffs were required, the dean's office generally took a proportionately greater share of the burden. Budget reductions were frequently administered across the board, with the dean's office and departments sharing the burden; at some schools, however, reductions were distributed across departments based on "ability to pay."

Other tactical steps taken to address budget reductions and revenue shortfalls included:

• Consolidating basic science departments
• Consolidating administrative functions across departments (e.g., IT, grants management)
• Consolidating administrative functions among the medical school, clinical practice, and an affiliated hospital (e.g., IT, budgeting, cashier, safety, human resources, credentialing)
• Increasing tuition and student fees
• Delaying capital projects, thus preserving cash
• Deferring building maintenance
• Changing benefits plans, including retirement contributions
• Eliminating bonuses and position reclassifications
• Implementing clinical performance incentives and clinical workload standards
• Changing faculty compensation (e.g., limiting guaranteed salary for tenured faculty)
• Developing new clinical service lines in order to increase patient volumes and reduce expenses
• Consolidating faculty clinical practice plans
• Delaying implementation of strategic priorities (e.g., increasing class size, electronic medical records)
• Implementing plans to consume less energy
• Implementing "green" initiatives to reduce waste
• Closing regional physician offices and clinics
• Implementing voluntary retirement programs or buy-outs for senior faculty
• Outsourcing services, such as printing
• Delaying chair recruitment and giving greater consideration to internal candidates
• Developing new programs to generate tuition revenues (e.g., post-baccalaureate certificates)
• Enforcing policies on space utilization and density
• Developing partnerships with affiliate hospitals that result in profit sharing
• Draining reserves

A number of schools reported that protecting the educational mission influenced their decisions, though generally not in the context of an overall institutional strategy for dealing with losses. Only three of the 23 schools interviewed reported steps the institution took in the context of an overarching strategic plan. Even when schools adopted a strategic approach, many (if not most) of the steps they took were identical to those of schools whose approach was tactical. However, in these schools, the decision-making process included identifying steps needed to achieve both short-term and long-term organizational goals.

It should be noted that the schools that adopted a strategic approach faced very significant budget reductions and/or revenue shortfalls, in the tens of millions of dollars; however, not all schools that faced significant losses adopted a strategic approach. In some cases, the magnitude of budget reductions and revenue shortfalls was both sudden and unanticipated; there was simply no time to develop a process for strategic decision-making.

As described by the PBO at one school that adopted a longer-term strategic approach: "Our overall intention was to change the culture of how people operate at the institution and to have their response be long term and sustainable, even as times got better." This school's plan included four components. First and foremost, they considered the feasibility of developing and implementing "more systematic business practices" as the "keystone" to long-term cost reductions. As part of their overall strategy, the medical school partnered with their affiliate hospital in the planning process. The dean and hospital CEO committed to working collaboratively to make the organization a "high performing, highly reliable organization" and to strive for "resource maximization." The results of their efforts included eliminating and recombining redundant hospital and medical school support/administrative services in areas including purchasing, housekeeping, and capital planning. Other steps included cuts in discretionary spending, educating faculty and staff about the costs of running the school ("reduce, reuse, and re-think"), taking steps to reduce energy consumption, and reaching out to faculty and staff for ideas to improve efficiency. It should be noted that this was accomplished in the absence of a reporting structure that required the dean and hospital CEO to work together!

LESSONS LEARNED

PBOs who participated in the interviews were asked to describe what they learned from their experience that might benefit other schools facing similar circumstances. Their advice can be organized into three categories: 1) communication, 2) processes, and 3) crisis as an opportunity.

Nearly all of the PBOs with whom we spoke talked about frequent, honest, and open communication as critical to dealing with the situations they faced. They spoke about the importance of regular meetings among the dean, chairs, and, where possible, health system personnel, as well as "town hall" meetings with faculty and staff. The mechanism for communicating with students, residents, and postdoctoral trainees was not always clear, but many PBOs said that budget news updates were frequently posted on their websites. Even though communication was identified as paramount, significant shortfalls surfaced. Among these was that communications between the dean's office and chairs did not always filter down to faculty. Equally problematic, messages conveyed by chairs (and sometimes the dean's office) were sometimes inconsistent.

The second category relates to how the process was managed and how the process materially influenced the success/failure in dealing with the situations they were facing. Advice and observations included:

• Chairs and faculty almost reflexively denied the circumstances facing the institution, taking the position that the institution could "ride out the storm."
• Even when chairs were involved in deciding how to cut expenses, it was not uncommon for them to take the position that the cutbacks should not apply to their department.
• The dean's office did not always have transparent knowledge of departmental fund balances.
• Having a consolidated budget process that includes the medical school and the health system is essential to decision-making.
• If layoffs are necessary, effect them all at once and in unison.
• It is essential to align spending with priorities.
• Small decisions can cause huge ripples.
• Faculty should be directly involved at the earliest stage of the decision-making process, especially if the magnitude of cuts involves reducing faculty salaries, departmental reorganization, or other actions that affect them.
• Students should be engaged early in the process.
• Celebrate your successes!

The third category of lessons learned has to do with the notion that a crisis may create opportunities to change the organization in ways that are necessary but might otherwise be extremely difficult to implement. It can be characterized by Rahm Emanuel's expression, "Never let a serious crisis go to waste." As noted above, several schools used the crisis they faced as an opportunity to consolidate departments and administrative functions, reallocate resources across departments, restructure compensation, and implement productivity measures and standards. In addition, the crisis provided some organizations with the basis for strengthening the partnership between the medical school, the clinical practice, and affiliated hospitals.

CONCLUDING OBSERVATIONS

The impact of the financial crisis on schools that participated in the study ranged from imminent, devastating cuts that never materialized to situations involving material budget and program cuts, staff furloughs, and layoffs. We note, parenthetically, that we did not hear of a single instance in which a faculty member was laid off as a consequence of financial stress. Recent conversations with PBOs suggest that many schools continue to operate in a psychological state akin to "waiting for the ax to fall," or living on an earthquake fault line, even though the financial crisis has yet to seriously impact their organizations.

The decision-making and infrastructure conditions that led to the circumstances facing schools seem to cluster into four categories: (1) revenue projections built on the assumption that "past performance is indicative of future results," particularly with respect to the continued growth of the research enterprise, indirect cost recoveries, and endowments and endowment earnings; (2) resource commitments made on the assumption that increasing investments, particularly in the research enterprise, would result in increased revenues; (3) a lack of management systems and accountability mechanisms at both the dean's office and department levels, which hindered the ability of schools to react quickly; and (4) dependence on state/parent funds as the sole source of support for education and administrative infrastructure.

Steps taken to address the financial crisis fell into two overlapping categories, tactical and strategic, with the latter encompassing the responses found in the former but adding a set of overarching goals and communication processes focused on bringing about change across the entire organization. The crisis forced some schools to confront weak accounting, reporting, and management systems, and provided a rationale for introducing greater accountability into those systems. It also obliged some schools to identify and remove artificial political barriers that had resulted in duplicative and inefficient systems.

Shrinking resources, particularly endowment earnings and state appropriations, forced many schools to turn to short-term tactical solutions such as hiring freezes, layoffs, and furloughs. Even as the national economy shows signs of recovery, the impact of the financial crisis on medical schools, particularly those that rely heavily on state funds, will likely extend well into FY2011.

Commentary: “Fair Warning” – A Health System Executive’s Perspective

Those of you who have attended auctions may recall the auctioneer's cry of "fair warning," which signals bidders that they have one last chance to change the course of events before the gavel falls and the bidding ends. Reading the draft report of this study compels me to offer my own version of "fair warning" to my medical school colleagues. I offer this in hopes that we will jointly recognize, and come to grips with, the failed notion of incremental reactions as a substitute for an overarching strategic, and complementary tactical, multi-year approach to fundamentally redesigning our operating models. My advice is informed by my service as the senior executive of two teaching hospitals and practice plans, and as a consultant to the leadership of more than seventy-five medical schools and AMCs over the last twenty-five years.

With the near collapse of the financial industry, the rapid decrease in revenues and investment income, and the concomitant recession we are in the midst of, most care delivery institutions restored their margins by dramatically reducing their cost base (see the earlier reference to the COTH quarterly report).


The financing and feedback loops of hospitals and health delivery enterprises are crystal clear compared with those of medical schools. All hospitals are obliged to produce balance sheets and understandable profit/loss statements. Hospitals that fail to generate a sufficient margin, let alone break even, are subject to severe treatment by external entities such as bond rating agencies and creditors. Hence, the strategic planning and financial modeling processes of hospitals tend to have built-in upside and downside scenarios that enable their leadership to prescribe swift actions that preserve the long-term viability of the enterprise.

The financing of medical schools is, at best, characterized as "one-off." While they are commonly supported by only a handful of fund sources (e.g., clinical practice revenues, grants and contracts, hospital support, university overhead, philanthropy), the interplay of these dollars and the administrative infrastructures vary dramatically. It could be argued that hospitals, unlike their medical school counterparts, are not subject to the same labor constraints with respect to cutting costs (e.g., faculty tenure and due process). However, the external visibility and stresses of unions, media scrutiny, and serving the safety-net needs of their communities are similar constraints and trade-offs that lead me to believe we all have constraints to wrestle with.

A historical and somewhat analogous version of "fair warning" comes from the food preservation industry. In the 1920s, the first patents for refrigerators, then described as "refrigerating machines for cooling and preserving foods at home," were issued. The dominant ice cutters' union in Minnesota dismissed the newfangled machines as dangerous contraptions. Furthermore, its members were certain that, at best, all they needed to do in response was learn how to cut ice more efficiently.

As I read about or witness first-hand the tactical rather than strategic approaches taken by our medical schools, I cannot help but conclude that we are focusing on cutting the ice more efficiently. Taking steps to protect the status quo ignores reality. Unfortunately, there is evidence to suggest that this is not the first time our schools have chosen to believe that what is happening around us does not apply to us. The study by Heinig, Dickler, Korn, and Krakower in 2007 demonstrated that while the NIH had made it clear that research funding would be flat (at best) over the next three to five years, almost all schools in the country reported that their expected research portfolios would grow by 3%-5% per year over the same period.7 We see the same gold-rush mentality with the temporary ARRA stimulus funding that wears off in the next 12 months: have we not just increased our research base and consumption in ways that will inevitably lead to heightened research shortfalls?

I do not for a moment believe that we are out of the economic woods and can continue to operate as we have. If we do not press forward with substantive and systemic change, the likes of which we have not experienced for 40 years, then we will indeed experience the bang of the hammer, having missed our own industry's "fair warning."

1) Unendowed reserves and working capital include funds invested for the short and intermediate term, in investment vehicles like the Common Fund. Their earnings were typically driven by Treasury bill rates, which were below 0.2% for most of 2009 and below 0.1% for the last quarter of the year.

2) A comparison of COTH data on margins from the fourth quarter of 2008 to the second quarter of 2009 lends support to the PBOs' comments: the median margin went from -1.3% to +8.1% during this period.

3) According to data collected on the FY 2009 LCME Part I-A Annual Financial Questionnaire, the average change in practice plan revenue was about 6%.

4) The medians obscure huge differences between schools; total revenues in FY08 ranged from more than $1.5 billion to less than $10 million. See http://www.aamc.org/data/finance/.

5) Almost 40% of the funds supporting medical school activities are not directly controlled by the leadership of medical schools. Most often, these revenues are related to expenses of affiliated hospitals or clinical practice operations that are "off the books" of the medical school. Furthermore, funds associated with sponsored grants and contracts, which represent nearly 30% of total medical school revenues, are "restricted" by sponsors to a specific purpose, as are most gifts.

6) It is important to recognize that endowments exist because donors elect to fund certain programs. In general, the funds are restricted to specific uses and cannot be set aside and used for one-time expenses. If, for example, a given endowment is established to fund new faculty positions or a research center, once established the school is generally obliged to support the activity even if the endowment earnings fall short. Consequently, it would be a mistake to characterize "relying on endowment earnings" as a "choice." (Comment by George Andersson, CFO, Washington University School of Medicine.)

7) Heinig SJ, Krakower JY, Dickler HB, Korn D. Sustaining the Engine of U.S. Biomedical Discovery. NEJM. September 6, 2007:1042-1047.

APPENDIX

INTERVIEW QUESTIONS

I. Nature of crisis:

1) Cause(s) of stress, including reductions in:

   • state support
   • endowment earnings
   • gifts
   • clinical practice income
   • hospital support
   • other fund sources

2) Dollar magnitude

3) Previous use of funds (e.g., recruit faculty, financial aid, support/underwrite research)

II. Steps taken to address crisis:

1) Staff actions (e.g., furloughs, lay-offs, hiring freezes)

2) Revenue/Expenditure actions (e.g., increase tuition, limit discretionary spending, reduce salaries, eliminate raises, delay capital projects)

3) Restructuring/programmatic actions (e.g., combine departments, change curriculum, reorganize administrative functions)

4) Decision process

5) Who was involved (e.g., dean, FPP director, chairs, hospital, parent university)

6) Communication strategy

7) Complicating factors (e.g., unions, tenure, commitments, partners, contracts)

8) Timeframe — near and longer-term actions

III. Outcomes and Unanticipated Consequences

IV. Advice/Lessons for others

1) Major challenges

2) Roadblocks

3) Missteps

4) Facilitating conditions

5) Crisis as an opportunity to drive change

1 See, for example: Sickinger T. Downturn Cripples OHSU Staff. Oregonlive.com. December 2, 2008. http://www.oregonlive.com/business/oregonian/index.ssf?/base/business/122819191672940.xml&coll=7

Loyola University Health System Cutting More Than 440 Jobs. Chicagotribune.com. May 12, 2009. http://www.chicagotribune.com/topic/wgnam-loyola-090512,0,1204730.story

Gallagher K. Medical College of Wisconsin Will Cut Its Budget by 5%. Milwaukee Journal Sentinel. March 21, 2009. http://www.jsonline.com/business/41608377.html

Jan T. Harvard to Lay Off 275. Boston.com June 23, 2009. http://www.boston.com/news/local/breaking_news/2009/06/harvard_u_to_la.html

The UW System Furlough Guidelines. University of Wisconsin System. http://www.uwsa.edu/furloughs/

Furlough Information. University of Wisconsin Department of Medicine. http://www2.medicine.wisc.edu/home/hr/furlough

Implementing a Simpler Approach to Mission-Based Planning in a Medical School

Tod B. Sloan, MBA, MD, PhD, Celia I. Kaye, MD, PhD, William R. Allen, Brian E. Magness, and Steven A. Wartman, MD, PhD

Abstract

Changes in the education, research, and health care environments have had a major impact on the way in which medical schools fulfill their missions, and mission-based management approaches have been suggested to link the financial information of mission costs and revenues with measures of mission activity and productivity. The authors describe a simpler system, termed Mission-Aligned Planning (MAP™), and its development and implementation, during fiscal years 2002 and 2003, at the School of Medicine at the University of Texas Health Science Center at San Antonio, Texas. The MAP system merges financial measures and activity measures to allow a broad understanding of mission activities and to facilitate strategic planning at the school and departmental levels.

During the two fiscal years mentioned above, faculty of the school of medicine reported their annual hours spent in the four missions of teaching, research, clinical care, and administration and service in a survey designed by the faculty. A financial profit or loss in each mission was determined for each department by allocation of all departmental expenses and revenues to each mission. Faculty expenses (and related expenses) were allocated to the missions based on the percentage of faculty effort in each mission. This information was correlated with objective measures of mission activities.

The assessment of activity allowed a better understanding of the real costs of mission activities by linking salary costs, assumed to be related to faculty time, to the missions. This was a basis for strategic planning and for allocation of institutional resources.

Changes in the education, research, and health care environments have had a major impact on the way that medical schools fulfill their missions. Specifically, rising costs, including escalating salaries for faculty, coupled with declining clinical care reimbursement and shrinking educational support have presented substantial challenges, particularly regarding teaching. Without a regular assessment of resource allocation and faculty effort, the allocation of institutional resources can rapidly become misaligned with the activities and contributions of the faculty. It is therefore essential that methods be developed that permit objective assessments of resource allocation. Schools need a firm grasp on the efforts of their faculty and their allocated costs in order to plan strategically for the continued academic and financial success of the institution.

The Association of American Medical Colleges (AAMC) has promoted a methodology for this challenge known widely as mission-based management (MBM).1 In general, this involves the quantification of the activities of faculty in relation to the traditional missions of teaching, research, clinical care, and administration and service. MBM has also assessed the productivity of this activity and the costs associated with each mission. MBM systems have served as a tool for medical schools to optimize the alignment of institutional resources with both the existing activities of the faculty and new strategic initiatives. Many articles in the literature have shared various approaches and the associated results, which often involve the redistribution of resources based on the quantity and quality of faculty effort.1–15 Implicit is the recognition that historic methods of allocating funds do not match the faculty’s actual contributions to the missions. If data can be obtained that are derived from these actual contributions, then they can be used to “correct” the maldistribution of resources.

In addition, MBM has attempted to address the problem of defining the relative value of the productivity measures in each mission-related activity. Methods that have attempted to do so often result in a complex process that is difficult and/or expensive to administer. Further, such approaches may find it problematic to value nontraditional or novel methods of instruction. Additionally, academic faculty can find the assessment of productivity threatening. Nevertheless, using a system of weighted measures for faculty contributions can have the advantage of moving efforts more rapidly in the direction of desired change.

However, changes in the direction of the school’s missions may be possible using a system which is simpler, less expensive, and more acceptable to faculty than one that emphasizes productivity and the value of each mission-related activity. The faculty of the School of Medicine at the University of Texas Health Science Center at San Antonio (UTHSCSA) participated in the development of a simpler system, and its implementation over two fiscal years, FY2002 and FY2003, has provided the opportunity to assess its utility.

In this article, we describe that system and its development. It is a relatively simple method of mission-based management within a medical school, which focuses on assessing faculty activity in each of the missions and the associated revenues and costs. We have termed this process Mission-Aligned Planning (MAP™). Our goal was to gain the knowledge and insight necessary to guide the institution through its current challenges, change its direction in selected areas, and improve the operating margin of each department and the medical school.

Development of the System

The development of the MAP system, which began in 2001, was based on three criteria:

▪ that it involve a relatively simple method of assessing faculty effort and the cost of that effort, but make no attempt to weight the productivity or value of faculty effort in its primary data acquisition;

▪ that it prove to be useful at different administrative levels throughout the institution; and

▪ that it minimize cost by using as much available information and resources as possible.

We (the authors) developed the MAP system as a three-step process. After a series of departmental meetings organized by the dean and us to inform the faculty about the system’s goals and the institution’s commitment to the system, the first step was the enumeration of the faculty activities, which were measured by a survey of the faculty in each of the two academic years devoted to forming the database. Next, a process was developed to obtain the expenses and revenues relative to the school’s four missions of teaching, research, patient care, and administration and service. Administrative time was defined as activities within the institution (e.g., administrative positions or committee work) and was divided into activities within the department or hospital, for the medical school, and for the university. Service time was defined as time spent conducting activities outside the institution (e.g., national lectures, service on professional societies). Faculty development time (e.g., increasing administrative skills) was not differentiated between within and outside the institution.

This required that a consensus be achieved regarding specific conventions to be used for faculty activities that simultaneously apply to more than one mission (e.g., providing care for a patient while teaching medical students and/or residents). Finally, all faculty activity and financial information data were merged into a unified financial format that would facilitate both analytic and strategic decision making. In order to protect the confidentiality of faculty members as well as to permit department chairs the opportunity to manage their departments effectively, each of the school's 13 clinical departments received individual data on its own faculty only; the dean's office received information aggregated at the department and school-wide levels. (Those departments are Anesthesiology, Family and Community Medicine, Internal Medicine, Obstetrics and Gynecology, Ophthalmology, Orthopaedics, Otolaryngology, Pediatrics, Psychiatry, Radiation Oncology, Radiology, Rehabilitation Medicine, and Surgery.) The basic science departments at the institution are part of a separate graduate school of biomedical sciences and were not part of the MAP system. Because financial information is most easily acquired for a full academic year, it was decided to use two consecutive academic years, FY2002 and FY2003, to form the initial MAP database. At UTHSCSA, the academic and fiscal years are coincident and run from September 1 through August 31. We will refer to years as fiscal years throughout this article.

The activity survey was developed through a faculty-led process designed to maximize faculty input and buy-in. The survey sought to identify all activities considered essential for carrying out the school’s missions. Initially, the MAP system was “beta tested” in a department that had high levels of activity in all mission areas. It quickly became evident that there were several activities that appeared to be unique to this department. As a result, the activity identification process was broadened by using workgroups within each of the clinical departments to incrementally expand the scope of the activity survey and include the unique aspects of each department. Early in the process, two departments piloted the survey instrument to gain specific feedback on the activities measured and the methodology. A final faculty consensus group representing all departments finished the survey instrument and assisted in developing the instruction set for the system’s administration.

The faculty consensus group also determined the amount of time to be allocated to a faculty activity in cases where the time spent would either be difficult to recall (lecture preparation) or where substantial variation would likely exist (e.g., the percent of time spent in clinical work and research activities that would be allocated to teaching). For example, the group decided to limit the time allocated for lecture preparation to an amount agreed upon, based on the type of lecture (new lectures, significant updates of previous lectures, and minimal updates). In those cases where teaching time was spent during research or clinical care activities, the group determined which portion of that time would be allocated to teaching or to the primary activity (research or clinical care). These conventions were revised for the second-year survey based on data from the first year's faculty survey.

The faculty activity survey took approximately one year to develop. The consensus group decided to allocate 50% of time to teaching when either clinical activity or research activity was being conducted in the presence of students and the faculty indicated that teaching was occurring. This was unchanged during the second year of the survey, as the initial survey results indicated a median estimate of 50% by the faculty. Lectures were categorized as either new or as major or minor updates of previous lectures. The consensus group elected to allocate lecture preparation times of 16, 8, and 2 hours, respectively, per hour of lecture. The first survey results indicated median lecture preparation times of 10.4, 3.7, and 1.3 hours per hour lecture; these medians were used during the second survey period.
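A minimal sketch of these time-allocation conventions, using the FY2003 median preparation times and the 50% teaching split described above (the code and its names are ours; the actual instrument was an Excel spreadsheet, not a program):

```python
# Sketch of the survey's teaching-time conventions (our code, not the
# actual spreadsheet). Uses the FY2003 median preparation times and the
# 50% split for teaching embedded in clinical or research activity.

PREP_HOURS_PER_LECTURE_HOUR = {"new": 10.4, "major_update": 3.7, "minor_update": 1.3}
TEACHING_SPLIT = 0.5  # share of clinical/research time credited to teaching

def credited_teaching_hours(lectures, clinical_with_students, research_with_students):
    """Hours credited to the teaching mission for one faculty member.

    lectures: mapping of lecture type -> hours of lecture delivered.
    """
    lecture_time = sum(
        hours * (1 + PREP_HOURS_PER_LECTURE_HOUR[kind])
        for kind, hours in lectures.items()
    )
    embedded = TEACHING_SPLIT * (clinical_with_students + research_with_students)
    return lecture_time + embedded

# 6 hours of new lectures plus 100 clinical and 40 research hours with students:
# 6 * (1 + 10.4) + 0.5 * (100 + 40) = 138.4 hours credited to teaching
print(credited_teaching_hours({"new": 6}, 100, 40))
```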

Two other issues that needed resolution were the allocation of state-appropriated revenue and the allocation of faculty salary expenses. An administrative consensus group decided to allocate 80% of state-appropriated revenue to the teaching mission and 20% to administrative activities. This was based on the premise that these funds were intended by the state legislature to support the educational mission and that a portion was necessary to support the administrative infrastructure to facilitate teaching. Lengthy discussions occurred regarding this approach, notably whether some of these funds should be allocated to faculty support for start-up research activities. The decision to do so is an internal one, but the simplicity of the MAP methodology is such that this and other approaches can easily be changed and the resulting impact assessed. Since both education and administration had negative operating margins and required cross subsidization from the clinical and research missions, decreasing revenue to either of these would not change the overall picture.

Distribution of faculty salary expenses also deserves a note. First, it was agreed that there would be no distinction between missions when the cost of faculty time was assessed in the MAP system. An hour of teaching would be assigned the same cost (the actual cost of time for that faculty member) as an hour of clinical work or research. Second, it was decided to allocate salary expenses based on the percentage of faculty time spent in each of the four missions exclusive of clinical at-home calls. In the survey instrument, call time was defined as time away from the campus during which the faculty member was available by pager or phone for consultation. Call time that required the faculty to be “in-house” was included in clinical time.

The final survey instrument was distributed as an Excel spreadsheet and included a request for an estimate of the hours spent in mission-specific activities. Information was also requested regarding mission activities in specific geographic locations (e.g., different hospitals and clinics) and with different levels and types of trainees (e.g., medical students, graduate medical trainees, graduate students). Also requested were an estimate of time to complete the survey instrument, an estimate of lecture preparation time, and an estimate of the fraction of time during research or clinical activities that should be devoted to teaching. The spreadsheet compared the entered data to each faculty member’s estimate of the weekly hourly activity to assist in identifying areas of underreporting or overreporting. Faculty were asked to reexamine their entries when they exceeded maximum reasonable limits, and they were asked for corrected data when time entries were obviously in error. In order to make the survey most effective for mentoring, additional information such as detailed reports of products of academic work (e.g., papers published, presentations made) were also recorded.
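The spreadsheet's consistency check might look something like the following hypothetical sketch; the 48-week working year and the 15% tolerance are our assumptions, not conventions stated in the article:

```python
# Hypothetical version of the spreadsheet's consistency check: compare
# itemized annual hours against the faculty member's own weekly estimate.
# The 48-week year and 15% tolerance are assumptions, not stated conventions.

WEEKS_PER_YEAR = 48

def reporting_flag(itemized_annual_hours, estimated_weekly_hours, tolerance=0.15):
    expected = estimated_weekly_hours * WEEKS_PER_YEAR
    ratio = itemized_annual_hours / expected
    if ratio < 1 - tolerance:
        return "possible underreporting"
    if ratio > 1 + tolerance:
        return "possible overreporting"
    return "consistent"

print(reporting_flag(itemized_annual_hours=1700, estimated_weekly_hours=50))
# 1700 / (50 * 48) = 0.71 -> "possible underreporting"
```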

The collection of activity data began approximately three months after the completion of the fiscal year. Departments were asked to obtain survey information from all faculty, regardless of salary status, who participated in mission activities in excess of ten hours for the academic year. Each department strove for a 100% completion rate, as the departmental leadership wanted all of their mission activities to be recorded. The responsibility for having the faculty complete the surveys and then working with the faculty to correct survey entries was assumed by the respective departments. Therefore, the departmental leadership had access to departmental data immediately following the submission of the data. Faculty rosters, the human resource database, and clinical care records were used to identify faculty missed in the initial survey replies. After the spreadsheets were completed, the results were collated into a departmental aggregate by an impartial intermediary (TS) who acted as both an institutional advocate and a faculty advocate to ensure the completeness and integrity of the process.

The aggregate data were verified with the department leadership before presentation to the dean. A similar process was followed during the second survey year except that the consensus groups used information learned in the first year’s survey process to make changes in the instructions to improve the second year’s process. The faculty consensus group also reevaluated the time allocated for lecture preparation as well as the fraction of clinical and research time allocated to teaching when teaching was being conducted in the context of clinical and research activities. This second survey requested the same data; only the instructions and spreadsheet format were changed slightly.

It was recognized that the data entered by faculty would be an imperfect recollection of exact time utilization; therefore, a large variety of objective activity measures was collected to corroborate the departmental aggregate activity data. For example, the Dean’s Office collected readily available information about the hours of lecture activity, measures of student rotations, and various other measures of educational activities conducted by departments. For the clinical mission, RVU (relative value unit) activity was collected. RVUs are a common scale developed by the Health Care Financing Administration (HCFA) and subsequently modified to quantify the work and resource costs needed to provide physician services across all fields of medicine.16 The HCFA system of RVUs has a physician component, a practice component and a malpractice component. For our analysis, only the physician component was used.

Grant funding data was collected for objective measures of the research mission. For administrative activity (exclusive of service activity), the number of department faculty was recorded. It must be emphasized that these “productivity” measures were not primary data elements in the survey; rather, they were elements assembled by the Dean’s Office to provide a “reality check” on the information from the surveys.

In addition, no effort was made to assign relative values to various educational, clinical, research, or scholarly activities. For the purposes of our analysis, we assumed that all activities of the faculty were valuable, and that it was the role of the chair to direct faculty to tasks that were most beneficial to the department and school. In essence our goal was to obtain a reasonable picture of the activity distribution across the missions. We recognized that a focus on “perfect data” would not only be costly but likely impossible (e.g., measuring actual clinical teaching time).

The second and third steps were the development and preparation of the unified financial spreadsheets for merging the activity data and the expenses and revenues within each of the departments. Table 1 shows the unified financial worksheet for a hypothetical department. The administrative consensus group of school leadership and departmental administrators also determined the conventions for the allocation of institutional funds to the various missions. With respect to revenues, the allocation of practice plan income to the clinical mission and of National Institutes of Health grant support to the research mission was straightforward. As mentioned above, each department’s state appropriations were allocated according to an 80/20 split, with 80% of state funding used to support the educational mission and 20% used to support the departmental administrative infrastructure. On the expense side, faculty salaries were allocated across the missions proportionate to each department’s overall faculty survey results. While the faculty survey results do not provide a precise reflection of the effort of departmental support staff, department chairs felt that the activities of their support staff generally occur in the mission areas of the faculty they support. The allocation of non-salary expenses into the four missions was accomplished through a set of expense allocation guidelines developed by the administrative consensus group.
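To make the worksheet logic concrete, here is a simplified, hypothetical sketch of the mission-margin calculation for one department. It applies the stated revenue conventions (practice plan to clinical, grants to research, state funds split 80/20 between teaching and administration), but for brevity it spreads all expenses by survey effort shares, whereas the actual system allocated faculty-related expenses by effort and non-salary expenses through separate guidelines:

```python
# Simplified, hypothetical mission-margin calculation for one department.
# Revenue conventions follow the text; for brevity, ALL expenses are spread
# by survey effort shares, whereas the actual system allocated faculty-related
# expenses by effort and non-salary expenses by separate guidelines.

MISSIONS = ("teaching", "clinical", "research", "admin_service")

def mission_margins(revenues, total_expenses, effort_share):
    """effort_share maps mission -> fraction of faculty time (sums to 1)."""
    allocated_revenue = {
        "clinical": revenues["practice_plan"],
        "research": revenues["grants"],
        "teaching": 0.80 * revenues["state"],        # 80/20 state-fund split
        "admin_service": 0.20 * revenues["state"],
    }
    return {
        m: allocated_revenue[m] - total_expenses * effort_share[m]
        for m in MISSIONS
    }

margins = mission_margins(
    revenues={"practice_plan": 18e6, "grants": 7e6, "state": 4e6},
    total_expenses=30e6,
    effort_share={"teaching": 0.36, "clinical": 0.37,
                  "research": 0.17, "admin_service": 0.10},
)
print({m: f"{v / 1e6:+.1f}M" for m, v in margins.items()})
# teaching and administration run negative, cross-subsidized by clinical/research
```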

The final presentation of information used for strategic planning included bar graphs created from these worksheets that demonstrated the positive or negative financial margins in each mission for each department. Additional information was gained by correlating various mission activity subsets with the respective expense and revenue amounts.

In order to correlate the objective teaching data with the MAP survey teaching hours, regression analysis was used. The total teaching hours recorded in the survey for all categories for each department were compared to the sum of the percentage contributions of that department in each objective teaching category. In order to determine the correlation of the data submitted in the FY2002 and FY2003 years, the hours in each category of mission activity, adjusted for the number of FTEs (full-time equivalents), were compared. A p value less than 0.05 was considered significant.
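A sketch of this corroboration step, assuming SciPy's linregress is available; the per-department percentage shares below are invented for illustration (the article reports only the resulting significance, p = .011 for FY2002):

```python
# Sketch of the corroboration step, assuming SciPy is available. The
# per-department percentage shares below are invented; the article
# reports only the resulting significance (p = .011 for FY2002).

from scipy.stats import linregress

objective_share = [14.0, 9.5, 7.2, 11.8, 4.1, 6.6, 3.0, 9.9, 12.4, 5.5, 6.0, 4.8, 5.2]
map_share = [13.1, 10.2, 6.8, 12.5, 3.6, 7.0, 2.4, 9.1, 13.0, 6.1, 5.4, 5.3, 5.5]

result = linregress(objective_share, map_share)
print(f"r = {result.rvalue:.2f}, p = {result.pvalue:.4f}")
if result.pvalue < 0.05:
    print("MAP-reported teaching hours track the objective measures.")
```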

The First Two Years

Below we describe the results of using the MAP system during its first two years. Faculty surveys were completed three to four months after the end of each of the two fiscal years so that the financial information would match the recording of effort. In our description below, FY2002 refers to the fiscal and academic year of September 1, 2001, through August 31, 2002, and FY2003 refers to the fiscal and academic year of September 1, 2002, through August 31, 2003. During FY2002, 880 survey instruments were completed, representing 802.7 FTE; for FY2003, 987 survey instruments were completed, representing 892.8 FTE (the number of salaried faculty had increased substantially by the time of the second survey). Faculty indicated a median time for completing the survey instrument of 2.0 hours for FY2002 (with several faculty indicating substantially longer times to complete the survey). The median time for completing the survey instrument for FY2003 was 1.5 hours.

Table 2 shows the hours that all the school’s faculty reported, in the two surveys, toward fulfilling the various missions. Teaching hours shown included hours delivering lectures to students, non-lecture time (e.g., small group sessions, ward rounds), lecture preparation, teaching during clinical care (50% of the time when clinical care was being delivered and the faculty indicated that teaching was also occurring), teaching during research (50% of the time when teaching was being done during research activities), commuting between teaching at different locations, administrative time for teaching (e.g., organizing student rotations), and teaching development (e.g., taking courses on teaching). For the purposes of this report, teaching included hours spent with all students of the university: medical, dental, graduate, nursing, allied health, and all graduate medical trainees in programs sponsored by the institution (e.g., residents, fellows).

Also shown are the hours recorded in the clinical mission. These were the hours delivering direct clinical care (i.e., where the faculty were providing clinical care without teaching), delivering clinical care while teaching was occurring (50% of the clinical care time when teaching was occurring), commuting between clinical activity at different locations, other professional activities (e.g., dictating, record reviewing, legal work), faculty clinical development time (e.g., learning new clinical skills, continuing medical education courses), and hours spent "on call." Time recorded for faculty who were required to remain in the hospital on call (e.g., obstetrics, anesthesiology) was recorded in direct clinical care. Call time outside the hospital was divided into "pager call" (where a faculty member was not in the hospital but had to be available by pager to return, such as a surgeon on call) and "phone call" (where a faculty member was not in the hospital but had to be available by phone for consultation). The time spent on "pager call" or "phone call" was not included in the time allocated to the clinical mission for the purposes of determining the allocation of faculty-related expenses in the financial margin.

The hours recorded in the research mission were: time when the faculty was conducting research without teaching students, time when teaching was occurring (50% of the time was allocated to teaching and 50% to research), and time spent in faculty research development (e.g., learning new research skills). Time spent in research was not differentiated between the types of research (e.g., bench research, clinical research, population research).

Excluding pager and phone call, the distribution of the hours entered for FY2002 and FY2003 was 36.2% teaching in the first year (35.9% in the second year), 36.8% clinical work (36.4%), 16.9% research (17.7%), and 10.1% administration and service (10.0%). When the aggregate financial spreadsheet for the entire school is merged with the mission activities, the pattern of finances that emerges is shown in the financial margin bar chart in Figure 1. As indicated by the negative margin, the cost of education far exceeded its allocation of funds. As expected, the clinical mission was the major activity for which revenues exceeded expenses, with the research mission also providing a positive margin (note that research revenues included nonfederal payers and endowments). Also as expected, the administration and service mission's expenses exceeded its revenues. Hence the cross-subsidization of the teaching mission and the administration and service mission came primarily from the clinical mission and, to a minor extent, the research mission.

The objective information collected for the missions was used to confirm and corroborate the activity measures recorded by the surveys. This was particularly important for the teaching mission, where independent measures of clinical teaching were often not available. Figure 2 is a bar plot by department, for the first survey year, of the percentage of the entire teaching load as measured by independent objective data on teaching activity in each of four categories (undergraduate medical education course directorships, medical school didactic course contact hours, third-year medical school clerkship faculty hours, fourth-year medical school student rotation hours) versus the total teaching hours for each department collected by the survey (i.e., MAP total teaching hours). The general concurrence of the first four measures with the MAP measure supports the belief that the reported survey hours reasonably reflected the relative contribution of each department to the overall teaching load. Using regression analysis, there was a significant correlation between aggregated objective measures of teaching and total teaching hours reported in MAP, with a p value of .011.

Bar charts similar to that shown in Figure 1 were constructed for individual departments and served to highlight issues that then formed the basis of departmental strategic decision making. Information gained during the first MAP program year (FY2002) was used to assist the school leadership in allocating institutional funds for FY2004 (September 1, 2003 through August 31, 2004). However, because the effects of the FY2004 allocations were not known before the start of FY2005, no reallocations were based on information from the second MAP program year (FY2003). As the effects of the FY2004 allocations become clear in the context of information gathered during the third MAP program year (FY2005), we expect a refinement in the allocation of institutional funds in FY2006 or later.

We learned from this experience that the use of a full year of faculty survey and financial data, which must be collected and analyzed in the fiscal year following the year surveyed, results in a two-year delay between the year surveyed and implementation of changes in resource allocation. This delay could be reduced by utilizing a shorter survey period (e.g., first six months of the fiscal year), with analysis during the second half of the fiscal year and changes in allocation in the next fiscal year. Alternatively, since we observed little change in faculty activity between the two years, data from a previous year could be used to predict the subsequent year’s activity if no significant shift in activity has occurred. In future years, it is expected that insights derived from the MAP system will enable the school leadership to maintain a solid alignment of institutional funds with the missions and strategic plans of the school.

The emphasis in this analysis has been on developing insights into the net balance and cross-subsidization of school and departmental missions in order to guide strategic planning while improving overall financial performance. The analysis is also a tool for moving departments toward the vision of the school's leadership. This was particularly important where negative margins in key mission areas were of concern, or where the operating margin for the entire department was negative. A good example is shown in Figure 3 for a department that had substantial external support for the educational mission yet still had a negative overall departmental margin. In addition, the usual areas of positive margin (clinical and research) were also negative for this department. The plan at the time of the first survey had been for the department to expand its clinical enterprise to improve the operating margin. However, the actual negative margin in clinical work suggested that simply expanding the existing clinical paradigm would worsen the margin. Instead, the department was encouraged to reexamine each clinical activity, reduce those with negative margins to only the services necessary for the educational mission, and expand in areas of positive clinical margin. In this case the analysis indicated that a strategic change in clinical activity, not a change in resource allocation, was needed. The second survey year showed improvement, suggesting that this approach was helpful.

A second example is seen in Figure 4, which shows the financial bar chart for a specific department. In this department, a slightly negative operating margin worsened significantly in the second year (FY2003). Our analysis suggested that the overall change was caused by worsened margins in all mission areas. Of particular interest was the change in the research mission. To understand this change, the objective data and the survey data for all departments were merged (Figure 5): for every department, the average research support per hour of research activity was plotted against the average hours recorded in the research mission. Two departments deviate substantially from the roughly linear relationship followed by the remainder. One (Department Y) has relatively high funding per hour; the other (Department Z, the department in question) has low research support even though its research hours per FTE are high. This suggested that the department in question could improve its financial performance by increasing external research funding or by decreasing time spent conducting underfunded research. Alternatively, some research effort could be reallocated to increase clinical activity and income. Again, a strategic change in faculty effort was needed, not a reallocation of resources.
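
The Figure 5 screen is essentially a labeled scatter plot inspected for departments that fall off the common trend. A minimal sketch with hypothetical departments (Y high, Z low, mirroring the two outliers described above):

```python
# Sketch of the Figure 5 screen: a labeled scatter of research support
# per hour vs. research hours per FTE; departments off the common trend
# stand out visually. All departmental data here are hypothetical.
import matplotlib.pyplot as plt

depts = {"A": (300, 70), "B": (400, 85), "C": (500, 100),
         "D": (600, 115), "Y": (450, 220), "Z": (900, 35)}

for name, (hrs_per_fte, dollars_per_hr) in depts.items():
    plt.scatter(hrs_per_fte, dollars_per_hr)
    plt.annotate(name, (hrs_per_fte, dollars_per_hr))
plt.xlabel("Research hours per FTE")
plt.ylabel("Research support per hour ($)")
plt.show()  # Y sits far above the trend line; Z far below it
```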

Why the System Works

The success of the implementation of the MAP system was due to several factors. First, the leadership of the medical school demonstrated its commitment by visiting all departments to explain the system. Second, because the faculty developed the survey tool themselves, it best reflected their efforts; the final tool also included various measures of academic productivity that the departmental leadership could use for faculty mentoring and career guidance. Concerns about misuse of individual data by the central administration were addressed by reporting only aggregate data to the school leadership; only the faculty and the departmental leadership had access to individual data. Finally, only faculty in clinical departments participated in this project, obviating the need for comparisons between clinical and basic science faculty. In this context, the faculty embraced the opportunity to develop the survey and willingly completed the tool.

Concerns have been raised that a reporting tool such as this may be misconstrued as conflicting with other effort-reporting measures, such as reporting time on federally funded projects or contracted time (e.g., Veterans Administration salary eighths). Because the data were immediately available to the faculty and departmental leadership as the survey instruments were completed, the faculty had an opportunity to ensure that reporting was congruent across the various reports. The issue is complicated by the fact that reporting periods are not identical for different sources of funds, and definitions of time and effort differ. The school of medicine specifically asked faculty to compare their self-reported MAP data to federal effort reports to ensure appropriate completion of all forms.

This method of mission-based analysis differs substantially from those described in the majority of published articles on the topic. The fundamental difference is that our approach estimates the relative operating margin for each mission by determining the net cost of activity in that mission; the productivity and/or relative value of the activities contributing to each mission was not used as a primary measure. We placed no constraints on the data faculty could enter, unlike some approaches that force a limited number of hours worked per week.2,4 We also made relatively few assumptions regarding time allotments (i.e., the limit on lecture preparation time and the percentage of clinical and research time allocated to teaching when the faculty indicated that teaching was occurring during those activities). The cost of mission activities so measured includes all costs directly linked to mission activities as well as the portion of faculty salaries allocated by the proportion of faculty time devoted to each mission. This allows future planning to focus directly on the largest expense and the most crucial resource: faculty time.2,3 The result is a more realistic assessment of the faculty (salary) cost of a mission. In addition, since each salary cost can in theory be redirected to another mission, the opportunity cost of faculty activities can be estimated.
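
Because the salary cost attached to each mission could in principle be redirected, a back-of-the-envelope opportunity-cost estimate follows directly. A toy example with entirely hypothetical figures:

```python
# Toy opportunity-cost estimate (all figures hypothetical): the salary
# cost of time in one mission is also the cost of not redirecting that
# time to a mission with a positive margin.
salary = 180_000            # annual salary plus benefits
teaching_fraction = 0.30    # share of time in the teaching mission
clinical_return = 1.4       # assumed net revenue per $1 of clinical salary cost

teaching_salary_cost = teaching_fraction * salary
forgone_margin = teaching_salary_cost * (clinical_return - 1)
print(f"teaching salary cost ${teaching_salary_cost:,.0f}; "
      f"forgone clinical margin ${forgone_margin:,.0f}")
```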

In contrast, a system that emphasizes productivity bases resource allocation on past strategic decisions and past performance. Such a system has been termed “backward looking” when future allocations are driven strictly by past strategic decisions and data, and it tends to drive the organization toward the status quo unless sufficient incentives are built in to approach new activity targets. Further complicating this approach are the lags between the time the activity is measured, the time the resources are distributed, and the time the next assessment is made to determine the effect. By the time the data are collected from the past budget year, the next opportunity for resource reallocation is either midyear or the following fiscal year, often resulting in up to a two-year lag. This built-in delay makes decision making difficult.4

The approach presented here does not include valuations of particular activities based on past strategic decisions. As a result, the data are more straightforward, and strategic decisions regarding future resource allocation can be made more readily, based on an assessment of the actual cost of faculty effort as well as the opportunity costs associated with particular activities. In this sense, the MAP approach is more forward-looking than other MBM approaches. However, using a full fiscal year as the survey period produced a two-year delay between the survey year and changes in resource allocation, as would be true of other MBM systems as well. This long delay could be reduced by use of a shorter survey period, as noted previously.

A system of resource allocation based on productivity—“leadership by the numbers”5—also hampers strategic planning because allocations must then follow the data.6 This often leads to a search for “perfect” data, an exercise that adds complexity while delaying decision making.7 Such a focus on data may distract from strategic thinking by impeding the ability to respond to strengths, weaknesses, opportunities, and threats to the school's missions. Hence our goal was to develop a less complex system that gathers meaningful information to facilitate basic and strategic decision making.

We achieved our goal of a simple, inexpensive system that provides helpful insights for strategic decisions. Given the relatively low cost of the MAP system, it is important to emphasize that the real work was distributed among faculty and administrators, who we believe had a vested interest in the project. The collection and correction of survey data were done by the departmental administrators; actual data gathering and entry were tasked to individual faculty; and the financial analyses were conducted by the office of the associate dean for finance of the medical school. Only one individual (TS) received direct salary support: a faculty member who acted part-time as the intermediary between the departments and the medical school administration. This person developed the data repository, managed the data collation, and provided the school leadership with the aggregated effort data that were merged with the financial information. He also worked with the department chairs and administrators to ensure the completeness and integrity of the data while assuring its confidentiality. As a result, the medical school administration was able to work confidently with aggregate departmental data while individual departments were able to manage their faculty as best suited each department's mission, all in a complex and challenging environment in which multiple missions must be served by faculty with widely differing individual interests and motivations.6,8,9 For some departments, initiatives to increase the clinical and/or research missions consistent with the school's vision were developed with the understanding that they would eventually have positive margins. In others, where the operating margins for the clinical and research missions were negative (or where the overall balance was negative), the data were examined for insights that could help guide the department leadership.

Clearly, the real cost of the MAP system is the faculty time and effort spent completing the survey, plus the related administrative costs. Fortunately, the time to complete the survey decreased in the second year, suggesting that faculty were more familiar with the instrument. Other than the part-time individual mentioned above, there were no recurring fixed costs, so the cost of using survey data in subsequent years is minimal. The similarity of the data between the two survey years suggests that the frequency of subsequent surveys can be decreased unless major shifts in faculty number, activity, or school financing occur. Other approaches have focused on assessing productivity for medical school missions, and proposals for relative weighting of these products have been published.1,2,7,10–13 Each of these approaches requires a value judgment that can be used to strategically drive faculty productivity if reallocation of faculty activity is anticipated. However, previous publications have indicated that academic faculty are generally not accustomed to having their activity measured in this way.9 A method that surveys time utilization may therefore be less threatening than one focused on productivity or the relative values of various activities. While the assessment of productivity (and any associated incentive plan) can be effective in improving positive margins and providing funds for missions with negative margins, this approach can be threatening for the teaching mission or the administration and service mission, where increased productivity may not increase a positive margin or even contribute to a meaningful expansion. Further, measuring and valuing teaching conducted during clinical and research activities is quite difficult, as objective measures are hard to come by, and research may be difficult to assess, since productivity is cyclical and has a built-in lag time.14 When RVUs are used as the basis for clinical incentive systems,2,10 these units do not allow assessment of the impact or quality of the interaction, especially if teaching is occurring during clinical care. Similarly, it is difficult to assess the impact or quality of teaching in a large-group format versus more personalized teaching of a small group or an individual.3,7

The MAP system presented here is useful independent of the issue of quality, since it assesses only the cost of the activity. This cost-of-activity assessment also allows estimation of the opportunity costs when certain activities are reduced in favor of others, or when a new activity whose costs and revenues can be estimated is introduced. We agree that quality issues must be considered, but independent quality indicators can be used to weigh the various mission activities in light of their costs and operating margins.

Mission-based management, as promoted by the AAMC, has been defined as “a process for organizational decision making that is mission driven, ensures internal accountability, distributes resources in alignment with organization-wide goals, and is based on timely, open, and accurate information.”8 The methodology we present allows calculated “strategic” decision making, since the direction of movement is not predetermined and the estimated costs of strategic initiatives can be developed. As challenges or opportunities present themselves, the impacts of shifts in activities can be estimated because the relative costs and impacts on the operating margins can be approximated.

One potential drawback of the method presented here is the necessity of using a survey instrument. Previous publications have noted the value of faculty survey techniques in budgeting and manpower planning.2 The survey method, when linked to productivity measures, has the inherent disadvantage of potential abuse, especially when known incentives cause faculty to inflate the time entered.2,8,15 Because the current system does not link directly to productivity, the potential for abuse is reduced, and the independent collection of activity indicators from available school data allows confirmation of the validity of the survey's findings. If, as in our case, the survey instrument is developed by the faculty themselves, faculty anxiety is reduced.8,13,14 Further, since no weighting of the value of activities occurs with this method, the faculty have more confidence that the data will not be manipulated. The only manipulation in this system is through the conventions for lecture preparation time and for the fraction of teaching credited when teaching occurs during clinical or research activities; fortunately, these were set by the faculty conventions group and updated based on the findings of the initial faculty survey. Other survey-based methods have reported poor faculty participation.9 However, because each departmental administrator had a vested interest in ensuring that the full departmental contribution to the missions was recorded, we achieved full participation.

Survey methodology can also be criticized because of the inaccuracies inherent in faculty members' memory of their activities over the past year. In our school, the survey results were remarkably unchanged in hours logged per FTE when the two survey years were compared, suggesting that the data are likely representative of the distribution of faculty effort: comparing the aggregate FY2002 and FY2003 hours per FTE across reporting categories yielded a highly significant correlation (p = .0001). Consistent with this, a large number of faculty resubmitted their FY2002 data for FY2003. However, if the accuracy of the data is to be confirmed, an independent system for recording activity (e.g., lectures) would be needed. This would clearly increase the complexity (and cost) of the methodology and would likely still fall short of measuring the teaching associated with clinical and research activities. The survey has the advantage of establishing time utilization patterns that can be used for mentoring and for strategic planning of faculty members' careers; for example, the extent to which an individual's survey deviates from the expected pattern can serve as a focus for this planning.

Because the survey is based on a faculty member's recollection of his or her time utilization, it is essential to use corroborating objective information if decisions are to be made regarding resource allocation. A variety of information sources are generally available for the clinical and research missions, but key objective measures may need to be derived for the teaching mission. At our institution, a variety of measures of teaching effort (e.g., those shown in Figure 2) were readily available and were used to corroborate the relative distribution of teaching effort.

One useful aspect of this approach is that the data can be analyzed for any subset of the entire dataset (e.g., school, department, division, or individual faculty member). Thus the data help the school assess its overall operating margins, help departments review their contributions to the missions, and help individual faculty consider their progress toward promotion and tenure. Two departments used the survey to compare clinical productivity (based on RVUs) with the faculty time recorded in the survey, assisting faculty in focusing their efforts and balancing their academic time. A third department compared the effort recorded in the survey with the acquisition of grant funding and the publication of scholarly manuscripts in order to assess accountability for academic research time. Finally, the data may be useful for reports of accountability and activity to external stakeholders (such as time and effort reporting).

A second useful attribute of this method is that the simplicity of the operating margin analysis allows assessment of the impact of different assumptions on the conclusions. For example, as mentioned above, state appropriations were divided between the teaching and administrative missions; it would be straightforward to assess the impact of changing this 80:20 ratio, or of directing some of these funds to stimulate research. Similarly, it is not difficult to assess the impact of apportioning a different proportion of clinical time to teaching. In our original analysis, we allocated 50% of clinical time to teaching when both activities were occurring simultaneously. As shown in Figure 6, we recalculated the operating margin for the school of medicine by assuming that all clinical time was applied to the clinical mission and none to teaching. Compared with Figure 1, which used the 50% allocation, the operating margin improves in the teaching mission and decreases in the clinical mission, reflecting the impact of faculty (and related) expenses on the clinical mission. This difference indicates the degree of cross-subsidization from the clinical mission needed to support the teaching mission.
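
Such a sensitivity check amounts to a one-line change in the allocation logic. A sketch, again with hypothetical figures, comparing the 50% convention (Figure 1) with the 0% alternative (Figure 6):

```python
# Sensitivity sketch: vary the share of simultaneous clinical/teaching
# time credited to teaching (50% in Figure 1 vs. 0% in Figure 6).
# All dollar and hour figures are hypothetical.

def margins(split_to_teaching):
    overlap = 20.0  # hours (%) of clinical care delivered while teaching
    hours = {"teaching": 26.0 + split_to_teaching * overlap,
             "clinical": 27.0 + (1.0 - split_to_teaching) * overlap,
             "research": 17.0, "admin": 10.0}
    revenue = {"teaching": 20.0, "clinical": 160.0,
               "research": 55.0, "admin": 10.0}
    expense = {"teaching": 15.0, "clinical": 90.0,
               "research": 30.0, "admin": 25.0}
    total = sum(hours.values())
    return {m: revenue[m] - expense[m] - 100.0 * hours[m] / total  # $100M pool
            for m in hours}

for split in (0.5, 0.0):
    print(f"split={split}: {margins(split)}")
# Moving from 0.5 to 0.0 improves the teaching margin and worsens the
# clinical margin by the same amount -- the size of the cross-subsidy.
```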

In conclusion, over the two years during which the MAP system has been used, we have gained much useful insight into the budgetary challenges faced by the medical school. The data have been useful in the allocation of state resources to incrementally correct maldistributions caused by historical methods that no longer reflect actual contributions to the school’s activities or their desired strategic directions. As a simple, inexpensive tool, it has been easily integrated into the budgeting and planning process and has served to inform strategic decision making and resource allocation.

This article was originally published in the November 2005 issue of Academic Medicine.

References

1. Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:199–207.

2. Daugird AJ, Arndt JE, Olson PR. A computerized faculty time-management system in an academic family medicine department. Acad Med. 2003;78:129–36.

3. Watson RT, Romrell LJ. Mission-based budgeting: removing a graveyard. Acad Med. 1999;74:627–40.

4. Whitcomb ME. Mission-based management and the improvement of medical students' education. Acad Med. 2002;77:113–14.

5. Howell LP, Hogarth MA, Anders TF. Implementing a mission-based reporting system at an academic health center: a method for mission enhancement. Acad Med. 2003;78:645–51.

6. Ridley GT, Skochelak SF, Farrell PM. Mission-aligned management and allocation: a successfully implemented model of mission-based budgeting. Acad Med. 2002;77:124–29.

7. Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Acad Med. 2002;77:115–23.

8. Brigham EJ, Tellers CA, Rondinelli R. Academic survival through mission-based management. Am J Phys Med Rehabil. 2001;80:778–85.

9. Garson A, Strifert KE, Beck JR, et al. The metrics process: Baylor's development of a "report card" for faculty and departments. Acad Med. 1999;74:861–70.

10. Cramer JS, Ramalingam S, Rosenthal TC, Fox CH. Implementing a comprehensive relative-value-based incentive plan in an academic family medicine department. Acad Med. 2000;75:1159–66.

11. Bardes CL, Hayes JG. Are the teachers teaching? Measuring the educational activities of clinical faculty. Acad Med. 1995;70:111–14.

12. Hilton C, Fisher W Jr, Lopez A, Sanders C. A relative-value-based system for calculating faculty productivity in teaching, research, administration, and patient care. Acad Med. 1997;72:787–93.

13. Coleman DL, Moran E, Serfilippi D, et al. Measuring physicians' productivity in a Veterans Affairs Medical Center. Acad Med. 2003;78:682–89.

14. Howell LP, Hogarth M, Anders TF. Creating a mission-based reporting system at an academic health center. Acad Med. 2002;77:130–38.

15. Ruedy J, MacDonald NE, MacDougall B. Ten-year experience with mission-based budgeting in the faculty of medicine of Dalhousie University. Acad Med. 2003;78:1121–29.

16. Johnson SE, Newton WP. Resource-based relative value units: a primer for academic family physicians. Fam Med. 2002;34:172–76.